Jan 26 14:04:39 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 26 14:04:39 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 26 14:04:39 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 14:04:39 localhost kernel: BIOS-provided physical RAM map:
Jan 26 14:04:39 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 26 14:04:39 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 26 14:04:39 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 26 14:04:39 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 26 14:04:39 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 26 14:04:39 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 26 14:04:39 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 26 14:04:39 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 26 14:04:39 localhost kernel: NX (Execute Disable) protection: active
Jan 26 14:04:39 localhost kernel: APIC: Static calls initialized
Jan 26 14:04:39 localhost kernel: SMBIOS 2.8 present.
Jan 26 14:04:39 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 26 14:04:39 localhost kernel: Hypervisor detected: KVM
Jan 26 14:04:39 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 26 14:04:39 localhost kernel: kvm-clock: using sched offset of 3370523903 cycles
Jan 26 14:04:39 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 26 14:04:39 localhost kernel: tsc: Detected 2799.998 MHz processor
Jan 26 14:04:39 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 26 14:04:39 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 26 14:04:39 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 26 14:04:39 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 26 14:04:39 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 26 14:04:39 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 26 14:04:39 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 26 14:04:39 localhost kernel: Using GB pages for direct mapping
Jan 26 14:04:39 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 26 14:04:39 localhost kernel: ACPI: Early table checksum verification disabled
Jan 26 14:04:39 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 26 14:04:39 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 14:04:39 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 14:04:39 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 14:04:39 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 26 14:04:39 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 14:04:39 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 14:04:39 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 26 14:04:39 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 26 14:04:39 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 26 14:04:39 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 26 14:04:39 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 26 14:04:39 localhost kernel: No NUMA configuration found
Jan 26 14:04:39 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 26 14:04:39 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 26 14:04:39 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 26 14:04:39 localhost kernel: Zone ranges:
Jan 26 14:04:39 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 26 14:04:39 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 26 14:04:39 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 14:04:39 localhost kernel:   Device   empty
Jan 26 14:04:39 localhost kernel: Movable zone start for each node
Jan 26 14:04:39 localhost kernel: Early memory node ranges
Jan 26 14:04:39 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 26 14:04:39 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 26 14:04:39 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 14:04:39 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 26 14:04:39 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 26 14:04:39 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 26 14:04:39 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 26 14:04:39 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 26 14:04:39 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 26 14:04:39 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 26 14:04:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 26 14:04:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 26 14:04:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 26 14:04:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 26 14:04:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 26 14:04:39 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 26 14:04:39 localhost kernel: TSC deadline timer available
Jan 26 14:04:39 localhost kernel: CPU topo: Max. logical packages:   8
Jan 26 14:04:39 localhost kernel: CPU topo: Max. logical dies:       8
Jan 26 14:04:39 localhost kernel: CPU topo: Max. dies per package:   1
Jan 26 14:04:39 localhost kernel: CPU topo: Max. threads per core:   1
Jan 26 14:04:39 localhost kernel: CPU topo: Num. cores per package:     1
Jan 26 14:04:39 localhost kernel: CPU topo: Num. threads per package:   1
Jan 26 14:04:39 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 26 14:04:39 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 26 14:04:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 26 14:04:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 26 14:04:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 26 14:04:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 26 14:04:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 26 14:04:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 26 14:04:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 26 14:04:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 26 14:04:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 26 14:04:39 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 26 14:04:39 localhost kernel: Booting paravirtualized kernel on KVM
Jan 26 14:04:39 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 26 14:04:39 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 26 14:04:39 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 26 14:04:39 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 26 14:04:39 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 26 14:04:39 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 26 14:04:39 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 14:04:39 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 26 14:04:39 localhost kernel: random: crng init done
Jan 26 14:04:39 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 26 14:04:39 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 26 14:04:39 localhost kernel: Fallback order for Node 0: 0 
Jan 26 14:04:39 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 26 14:04:39 localhost kernel: Policy zone: Normal
Jan 26 14:04:39 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 26 14:04:39 localhost kernel: software IO TLB: area num 8.
Jan 26 14:04:39 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 26 14:04:39 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 26 14:04:39 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 26 14:04:39 localhost kernel: Dynamic Preempt: voluntary
Jan 26 14:04:39 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 26 14:04:39 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 26 14:04:39 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 26 14:04:39 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 26 14:04:39 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 26 14:04:39 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 26 14:04:39 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 26 14:04:39 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 26 14:04:39 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 14:04:39 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 14:04:39 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 14:04:39 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 26 14:04:39 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 26 14:04:39 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 26 14:04:39 localhost kernel: Console: colour VGA+ 80x25
Jan 26 14:04:39 localhost kernel: printk: console [ttyS0] enabled
Jan 26 14:04:39 localhost kernel: ACPI: Core revision 20230331
Jan 26 14:04:39 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 26 14:04:39 localhost kernel: x2apic enabled
Jan 26 14:04:39 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 26 14:04:39 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 26 14:04:39 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 26 14:04:39 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 26 14:04:39 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 26 14:04:39 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 26 14:04:39 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 26 14:04:39 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 26 14:04:39 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 26 14:04:39 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 26 14:04:39 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 26 14:04:39 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 26 14:04:39 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 26 14:04:39 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 26 14:04:39 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 26 14:04:39 localhost kernel: x86/bugs: return thunk changed
Jan 26 14:04:39 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 26 14:04:39 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 26 14:04:39 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 26 14:04:39 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 26 14:04:39 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 26 14:04:39 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 26 14:04:39 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 26 14:04:39 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 26 14:04:39 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 26 14:04:39 localhost kernel: landlock: Up and running.
Jan 26 14:04:39 localhost kernel: Yama: becoming mindful.
Jan 26 14:04:39 localhost kernel: SELinux:  Initializing.
Jan 26 14:04:39 localhost kernel: LSM support for eBPF active
Jan 26 14:04:39 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 14:04:39 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 14:04:39 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 26 14:04:39 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 26 14:04:39 localhost kernel: ... version:                0
Jan 26 14:04:39 localhost kernel: ... bit width:              48
Jan 26 14:04:39 localhost kernel: ... generic registers:      6
Jan 26 14:04:39 localhost kernel: ... value mask:             0000ffffffffffff
Jan 26 14:04:39 localhost kernel: ... max period:             00007fffffffffff
Jan 26 14:04:39 localhost kernel: ... fixed-purpose events:   0
Jan 26 14:04:39 localhost kernel: ... event mask:             000000000000003f
Jan 26 14:04:39 localhost kernel: signal: max sigframe size: 1776
Jan 26 14:04:39 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 26 14:04:39 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 26 14:04:39 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 26 14:04:39 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 26 14:04:39 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 26 14:04:39 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 26 14:04:39 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 26 14:04:39 localhost kernel: node 0 deferred pages initialised in 13ms
Jan 26 14:04:39 localhost kernel: Memory: 7763820K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 26 14:04:39 localhost kernel: devtmpfs: initialized
Jan 26 14:04:39 localhost kernel: x86/mm: Memory block size: 128MB
Jan 26 14:04:39 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 26 14:04:39 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 26 14:04:39 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 26 14:04:39 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 26 14:04:39 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 26 14:04:39 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 26 14:04:39 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 26 14:04:39 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 26 14:04:39 localhost kernel: audit: type=2000 audit(1769436277.627:1): state=initialized audit_enabled=0 res=1
Jan 26 14:04:39 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 26 14:04:39 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 26 14:04:39 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 26 14:04:39 localhost kernel: cpuidle: using governor menu
Jan 26 14:04:39 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 26 14:04:39 localhost kernel: PCI: Using configuration type 1 for base access
Jan 26 14:04:39 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 26 14:04:39 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 26 14:04:39 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 26 14:04:39 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 26 14:04:39 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 26 14:04:39 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 26 14:04:39 localhost kernel: Demotion targets for Node 0: null
Jan 26 14:04:39 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 26 14:04:39 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 26 14:04:39 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 26 14:04:39 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 26 14:04:39 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 26 14:04:39 localhost kernel: ACPI: Interpreter enabled
Jan 26 14:04:39 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 26 14:04:39 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 26 14:04:39 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 26 14:04:39 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 26 14:04:39 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 26 14:04:39 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 26 14:04:39 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [3] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [4] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [5] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [6] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [7] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [8] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [9] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [10] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [11] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [12] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [13] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [14] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [15] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [16] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [17] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [18] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [19] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [20] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [21] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [22] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [23] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [24] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [25] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [26] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [27] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [28] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [29] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [30] registered
Jan 26 14:04:39 localhost kernel: acpiphp: Slot [31] registered
Jan 26 14:04:39 localhost kernel: PCI host bridge to bus 0000:00
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 26 14:04:39 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 26 14:04:39 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 26 14:04:39 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 26 14:04:39 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 26 14:04:39 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 26 14:04:39 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 26 14:04:39 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 26 14:04:39 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 26 14:04:39 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 26 14:04:39 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 26 14:04:39 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 26 14:04:39 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 26 14:04:39 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 26 14:04:39 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 26 14:04:39 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 26 14:04:39 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 14:04:39 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 26 14:04:39 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 26 14:04:39 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 26 14:04:39 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 26 14:04:39 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 26 14:04:39 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 26 14:04:39 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 26 14:04:39 localhost kernel: iommu: Default domain type: Translated
Jan 26 14:04:39 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 26 14:04:39 localhost kernel: SCSI subsystem initialized
Jan 26 14:04:39 localhost kernel: ACPI: bus type USB registered
Jan 26 14:04:39 localhost kernel: usbcore: registered new interface driver usbfs
Jan 26 14:04:39 localhost kernel: usbcore: registered new interface driver hub
Jan 26 14:04:39 localhost kernel: usbcore: registered new device driver usb
Jan 26 14:04:39 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 26 14:04:39 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 26 14:04:39 localhost kernel: PTP clock support registered
Jan 26 14:04:39 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 26 14:04:39 localhost kernel: NetLabel: Initializing
Jan 26 14:04:39 localhost kernel: NetLabel:  domain hash size = 128
Jan 26 14:04:39 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 26 14:04:39 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 26 14:04:39 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 26 14:04:39 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 26 14:04:39 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 26 14:04:39 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 26 14:04:39 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 26 14:04:39 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 26 14:04:39 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 26 14:04:39 localhost kernel: vgaarb: loaded
Jan 26 14:04:39 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 26 14:04:39 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 26 14:04:39 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 26 14:04:39 localhost kernel: pnp: PnP ACPI init
Jan 26 14:04:39 localhost kernel: pnp 00:03: [dma 2]
Jan 26 14:04:39 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 26 14:04:39 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 26 14:04:39 localhost kernel: NET: Registered PF_INET protocol family
Jan 26 14:04:39 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 26 14:04:39 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 26 14:04:39 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 26 14:04:39 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 26 14:04:39 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 26 14:04:39 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 26 14:04:39 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 26 14:04:39 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 14:04:39 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 14:04:39 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 26 14:04:39 localhost kernel: NET: Registered PF_XDP protocol family
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 26 14:04:39 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 26 14:04:39 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 26 14:04:39 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 26 14:04:39 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 92330 usecs
Jan 26 14:04:39 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 26 14:04:39 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 26 14:04:39 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 26 14:04:39 localhost kernel: ACPI: bus type thunderbolt registered
Jan 26 14:04:39 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 26 14:04:39 localhost kernel: Initialise system trusted keyrings
Jan 26 14:04:39 localhost kernel: Key type blacklist registered
Jan 26 14:04:39 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 26 14:04:39 localhost kernel: zbud: loaded
Jan 26 14:04:39 localhost kernel: integrity: Platform Keyring initialized
Jan 26 14:04:39 localhost kernel: integrity: Machine keyring initialized
Jan 26 14:04:39 localhost kernel: Freeing initrd memory: 87956K
Jan 26 14:04:39 localhost kernel: NET: Registered PF_ALG protocol family
Jan 26 14:04:39 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 26 14:04:39 localhost kernel: Key type asymmetric registered
Jan 26 14:04:39 localhost kernel: Asymmetric key parser 'x509' registered
Jan 26 14:04:39 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 26 14:04:39 localhost kernel: io scheduler mq-deadline registered
Jan 26 14:04:39 localhost kernel: io scheduler kyber registered
Jan 26 14:04:39 localhost kernel: io scheduler bfq registered
Jan 26 14:04:39 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 26 14:04:39 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 26 14:04:39 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 26 14:04:39 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 26 14:04:39 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 26 14:04:39 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 26 14:04:39 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 26 14:04:39 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 26 14:04:39 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 26 14:04:39 localhost kernel: Non-volatile memory driver v1.3
Jan 26 14:04:39 localhost kernel: rdac: device handler registered
Jan 26 14:04:39 localhost kernel: hp_sw: device handler registered
Jan 26 14:04:39 localhost kernel: emc: device handler registered
Jan 26 14:04:39 localhost kernel: alua: device handler registered
Jan 26 14:04:39 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 26 14:04:39 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 26 14:04:39 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 26 14:04:39 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 26 14:04:39 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 26 14:04:39 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 26 14:04:39 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 26 14:04:39 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 26 14:04:39 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 26 14:04:39 localhost kernel: hub 1-0:1.0: USB hub found
Jan 26 14:04:39 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 26 14:04:39 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 26 14:04:39 localhost kernel: usbserial: USB Serial support registered for generic
Jan 26 14:04:39 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 26 14:04:39 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 26 14:04:39 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 26 14:04:39 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 26 14:04:39 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 26 14:04:39 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 26 14:04:39 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-26T14:04:38 UTC (1769436278)
Jan 26 14:04:39 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 26 14:04:39 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 26 14:04:39 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 26 14:04:39 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 26 14:04:39 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 26 14:04:39 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 26 14:04:39 localhost kernel: usbcore: registered new interface driver usbhid
Jan 26 14:04:39 localhost kernel: usbhid: USB HID core driver
Jan 26 14:04:39 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 26 14:04:39 localhost kernel: Initializing XFRM netlink socket
Jan 26 14:04:39 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 26 14:04:39 localhost kernel: Segment Routing with IPv6
Jan 26 14:04:39 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 26 14:04:39 localhost kernel: mpls_gso: MPLS GSO support
Jan 26 14:04:39 localhost kernel: IPI shorthand broadcast: enabled
Jan 26 14:04:39 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 26 14:04:39 localhost kernel: AES CTR mode by8 optimization enabled
Jan 26 14:04:39 localhost kernel: sched_clock: Marking stable (1221013146, 148214696)->(1475066844, -105839002)
Jan 26 14:04:39 localhost kernel: registered taskstats version 1
Jan 26 14:04:39 localhost kernel: Loading compiled-in X.509 certificates
Jan 26 14:04:39 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 14:04:39 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 26 14:04:39 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 26 14:04:39 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 26 14:04:39 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 26 14:04:39 localhost kernel: Demotion targets for Node 0: null
Jan 26 14:04:39 localhost kernel: page_owner is disabled
Jan 26 14:04:39 localhost kernel: Key type .fscrypt registered
Jan 26 14:04:39 localhost kernel: Key type fscrypt-provisioning registered
Jan 26 14:04:39 localhost kernel: Key type big_key registered
Jan 26 14:04:39 localhost kernel: Key type encrypted registered
Jan 26 14:04:39 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 26 14:04:39 localhost kernel: Loading compiled-in module X.509 certificates
Jan 26 14:04:39 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 14:04:39 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 26 14:04:39 localhost kernel: ima: No architecture policies found
Jan 26 14:04:39 localhost kernel: evm: Initialising EVM extended attributes:
Jan 26 14:04:39 localhost kernel: evm: security.selinux
Jan 26 14:04:39 localhost kernel: evm: security.SMACK64 (disabled)
Jan 26 14:04:39 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 26 14:04:39 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 26 14:04:39 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 26 14:04:39 localhost kernel: evm: security.apparmor (disabled)
Jan 26 14:04:39 localhost kernel: evm: security.ima
Jan 26 14:04:39 localhost kernel: evm: security.capability
Jan 26 14:04:39 localhost kernel: evm: HMAC attrs: 0x1
Jan 26 14:04:39 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 26 14:04:39 localhost kernel: Running certificate verification RSA selftest
Jan 26 14:04:39 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 26 14:04:39 localhost kernel: Running certificate verification ECDSA selftest
Jan 26 14:04:39 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 26 14:04:39 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 26 14:04:39 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 26 14:04:39 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 26 14:04:39 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 26 14:04:39 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 26 14:04:39 localhost kernel: clk: Disabling unused clocks
Jan 26 14:04:39 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 26 14:04:39 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 26 14:04:39 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 26 14:04:39 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 26 14:04:39 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 26 14:04:39 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 26 14:04:39 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 26 14:04:39 localhost kernel: Run /init as init process
Jan 26 14:04:39 localhost kernel:   with arguments:
Jan 26 14:04:39 localhost kernel:     /init
Jan 26 14:04:39 localhost kernel:   with environment:
Jan 26 14:04:39 localhost kernel:     HOME=/
Jan 26 14:04:39 localhost kernel:     TERM=linux
Jan 26 14:04:39 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 26 14:04:39 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 14:04:39 localhost systemd[1]: Detected virtualization kvm.
Jan 26 14:04:39 localhost systemd[1]: Detected architecture x86-64.
Jan 26 14:04:39 localhost systemd[1]: Running in initrd.
Jan 26 14:04:39 localhost systemd[1]: No hostname configured, using default hostname.
Jan 26 14:04:39 localhost systemd[1]: Hostname set to <localhost>.
Jan 26 14:04:39 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 26 14:04:39 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 26 14:04:39 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 14:04:39 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 26 14:04:39 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 26 14:04:39 localhost systemd[1]: Reached target Local File Systems.
Jan 26 14:04:39 localhost systemd[1]: Reached target Path Units.
Jan 26 14:04:39 localhost systemd[1]: Reached target Slice Units.
Jan 26 14:04:39 localhost systemd[1]: Reached target Swaps.
Jan 26 14:04:39 localhost systemd[1]: Reached target Timer Units.
Jan 26 14:04:39 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 14:04:39 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 26 14:04:39 localhost systemd[1]: Listening on Journal Socket.
Jan 26 14:04:39 localhost systemd[1]: Listening on udev Control Socket.
Jan 26 14:04:39 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 26 14:04:39 localhost systemd[1]: Reached target Socket Units.
Jan 26 14:04:39 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 26 14:04:39 localhost systemd[1]: Starting Journal Service...
Jan 26 14:04:39 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 14:04:39 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 26 14:04:39 localhost systemd[1]: Starting Create System Users...
Jan 26 14:04:39 localhost systemd[1]: Starting Setup Virtual Console...
Jan 26 14:04:39 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 26 14:04:39 localhost systemd-journald[305]: Journal started
Jan 26 14:04:39 localhost systemd-journald[305]: Runtime Journal (/run/log/journal/79d25091300b4c01ad9685507a639987) is 8.0M, max 153.6M, 145.6M free.
Jan 26 14:04:39 localhost systemd[1]: Started Journal Service.
Jan 26 14:04:39 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 26 14:04:39 localhost systemd-sysusers[310]: Creating group 'users' with GID 100.
Jan 26 14:04:39 localhost systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Jan 26 14:04:39 localhost systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 26 14:04:39 localhost systemd[1]: Finished Create System Users.
Jan 26 14:04:39 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 14:04:39 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 14:04:39 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 14:04:39 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 14:04:39 localhost systemd[1]: Finished Setup Virtual Console.
Jan 26 14:04:39 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 26 14:04:39 localhost systemd[1]: Starting dracut cmdline hook...
Jan 26 14:04:39 localhost dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Jan 26 14:04:39 localhost dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 14:04:39 localhost systemd[1]: Finished dracut cmdline hook.
Jan 26 14:04:39 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 26 14:04:39 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 26 14:04:39 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 26 14:04:39 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 26 14:04:39 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 26 14:04:39 localhost kernel: RPC: Registered udp transport module.
Jan 26 14:04:39 localhost kernel: RPC: Registered tcp transport module.
Jan 26 14:04:39 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 26 14:04:39 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 26 14:04:40 localhost rpc.statd[442]: Version 2.5.4 starting
Jan 26 14:04:40 localhost rpc.statd[442]: Initializing NSM state
Jan 26 14:04:40 localhost rpc.idmapd[447]: Setting log level to 0
Jan 26 14:04:40 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 26 14:04:40 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 14:04:40 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 14:04:40 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 14:04:40 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 26 14:04:40 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 26 14:04:40 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 26 14:04:40 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 26 14:04:40 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 14:04:40 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 26 14:04:40 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 14:04:40 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 14:04:40 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 26 14:04:40 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 14:04:40 localhost systemd[1]: Reached target Network.
Jan 26 14:04:40 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 14:04:40 localhost systemd[1]: Starting dracut initqueue hook...
Jan 26 14:04:40 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 26 14:04:40 localhost systemd[1]: Reached target System Initialization.
Jan 26 14:04:40 localhost systemd[1]: Reached target Basic System.
Jan 26 14:04:40 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 26 14:04:40 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 26 14:04:40 localhost kernel:  vda: vda1
Jan 26 14:04:40 localhost kernel: libata version 3.00 loaded.
Jan 26 14:04:40 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 26 14:04:40 localhost kernel: scsi host0: ata_piix
Jan 26 14:04:40 localhost kernel: scsi host1: ata_piix
Jan 26 14:04:40 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 26 14:04:40 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 26 14:04:40 localhost systemd-udevd[490]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 14:04:40 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 14:04:40 localhost systemd[1]: Reached target Initrd Root Device.
Jan 26 14:04:40 localhost kernel: ata1: found unknown device (class 0)
Jan 26 14:04:40 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 26 14:04:40 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 26 14:04:40 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 26 14:04:40 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 26 14:04:40 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 26 14:04:40 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 26 14:04:40 localhost systemd[1]: Finished dracut initqueue hook.
Jan 26 14:04:40 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 14:04:40 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 26 14:04:40 localhost systemd[1]: Reached target Remote File Systems.
Jan 26 14:04:40 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 26 14:04:40 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 26 14:04:41 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 26 14:04:41 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Jan 26 14:04:41 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 14:04:41 localhost systemd[1]: Mounting /sysroot...
Jan 26 14:04:41 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 26 14:04:41 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 26 14:04:41 localhost kernel: XFS (vda1): Ending clean mount
Jan 26 14:04:41 localhost systemd[1]: Mounted /sysroot.
Jan 26 14:04:41 localhost systemd[1]: Reached target Initrd Root File System.
Jan 26 14:04:41 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 26 14:04:41 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 26 14:04:41 localhost systemd[1]: Reached target Initrd File Systems.
Jan 26 14:04:41 localhost systemd[1]: Reached target Initrd Default Target.
Jan 26 14:04:41 localhost systemd[1]: Starting dracut mount hook...
Jan 26 14:04:41 localhost systemd[1]: Finished dracut mount hook.
Jan 26 14:04:41 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 26 14:04:41 localhost rpc.idmapd[447]: exiting on signal 15
Jan 26 14:04:41 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 26 14:04:41 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 26 14:04:41 localhost systemd[1]: Stopped target Network.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Timer Units.
Jan 26 14:04:41 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 26 14:04:41 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Basic System.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Path Units.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Remote File Systems.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Slice Units.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Socket Units.
Jan 26 14:04:41 localhost systemd[1]: Stopped target System Initialization.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Local File Systems.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Swaps.
Jan 26 14:04:41 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped dracut mount hook.
Jan 26 14:04:41 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 26 14:04:41 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 26 14:04:41 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 26 14:04:41 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 26 14:04:41 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 26 14:04:41 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 26 14:04:41 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 26 14:04:41 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 26 14:04:41 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 26 14:04:41 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 26 14:04:41 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 26 14:04:41 localhost systemd[1]: systemd-udevd.service: Consumed 1.166s CPU time.
Jan 26 14:04:41 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 26 14:04:41 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Closed udev Control Socket.
Jan 26 14:04:41 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Closed udev Kernel Socket.
Jan 26 14:04:41 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 26 14:04:41 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 26 14:04:41 localhost systemd[1]: Starting Cleanup udev Database...
Jan 26 14:04:41 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 26 14:04:41 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 26 14:04:41 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Stopped Create System Users.
Jan 26 14:04:41 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 26 14:04:41 localhost systemd[1]: Finished Cleanup udev Database.
Jan 26 14:04:41 localhost systemd[1]: Reached target Switch Root.
Jan 26 14:04:41 localhost systemd[1]: Starting Switch Root...
Jan 26 14:04:41 localhost systemd[1]: Switching root.
Jan 26 14:04:41 localhost systemd-journald[305]: Journal stopped
Jan 26 14:04:42 localhost systemd-journald[305]: Received SIGTERM from PID 1 (systemd).
Jan 26 14:04:42 localhost kernel: audit: type=1404 audit(1769436282.018:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 26 14:04:42 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 14:04:42 localhost kernel: SELinux:  policy capability open_perms=1
Jan 26 14:04:42 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 14:04:42 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 26 14:04:42 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 14:04:42 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 14:04:42 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 14:04:42 localhost kernel: audit: type=1403 audit(1769436282.145:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 26 14:04:42 localhost systemd[1]: Successfully loaded SELinux policy in 129.739ms.
Jan 26 14:04:42 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.535ms.
Jan 26 14:04:42 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 14:04:42 localhost systemd[1]: Detected virtualization kvm.
Jan 26 14:04:42 localhost systemd[1]: Detected architecture x86-64.
Jan 26 14:04:42 localhost systemd-rc-local-generator[638]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:04:42 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 26 14:04:42 localhost systemd[1]: Stopped Switch Root.
Jan 26 14:04:42 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 26 14:04:42 localhost systemd[1]: Created slice Slice /system/getty.
Jan 26 14:04:42 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 26 14:04:42 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 26 14:04:42 localhost systemd[1]: Created slice User and Session Slice.
Jan 26 14:04:42 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 14:04:42 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 26 14:04:42 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 26 14:04:42 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 26 14:04:42 localhost systemd[1]: Stopped target Switch Root.
Jan 26 14:04:42 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 26 14:04:42 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 26 14:04:42 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 26 14:04:42 localhost systemd[1]: Reached target Path Units.
Jan 26 14:04:42 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 26 14:04:42 localhost systemd[1]: Reached target Slice Units.
Jan 26 14:04:42 localhost systemd[1]: Reached target Swaps.
Jan 26 14:04:42 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 26 14:04:42 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 26 14:04:42 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 26 14:04:42 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 26 14:04:42 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 26 14:04:42 localhost systemd[1]: Listening on udev Control Socket.
Jan 26 14:04:42 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 26 14:04:42 localhost systemd[1]: Mounting Huge Pages File System...
Jan 26 14:04:42 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 26 14:04:42 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 26 14:04:42 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 26 14:04:42 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 14:04:42 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 26 14:04:42 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 14:04:42 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 26 14:04:42 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 26 14:04:42 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 26 14:04:42 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 26 14:04:42 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 26 14:04:42 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 26 14:04:42 localhost systemd[1]: Stopped Journal Service.
Jan 26 14:04:42 localhost systemd[1]: Starting Journal Service...
Jan 26 14:04:42 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 14:04:42 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 26 14:04:42 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 14:04:42 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 26 14:04:42 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 26 14:04:42 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 26 14:04:42 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 26 14:04:42 localhost kernel: fuse: init (API version 7.37)
Jan 26 14:04:42 localhost systemd[1]: Mounted Huge Pages File System.
Jan 26 14:04:42 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 26 14:04:42 localhost systemd-journald[679]: Journal started
Jan 26 14:04:42 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 14:04:42 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 26 14:04:42 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 26 14:04:42 localhost systemd[1]: Started Journal Service.
Jan 26 14:04:42 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 26 14:04:42 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 26 14:04:42 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 26 14:04:42 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 26 14:04:42 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 14:04:42 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 14:04:42 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 26 14:04:42 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 26 14:04:42 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 26 14:04:42 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 26 14:04:42 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 26 14:04:42 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 26 14:04:42 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 26 14:04:42 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 26 14:04:42 localhost kernel: ACPI: bus type drm_connector registered
Jan 26 14:04:42 localhost systemd[1]: Mounting FUSE Control File System...
Jan 26 14:04:42 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 14:04:42 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 26 14:04:42 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 26 14:04:42 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 26 14:04:42 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 26 14:04:42 localhost systemd[1]: Starting Create System Users...
Jan 26 14:04:42 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 26 14:04:42 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 26 14:04:42 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 14:04:42 localhost systemd-journald[679]: Received client request to flush runtime journal.
Jan 26 14:04:42 localhost systemd[1]: Mounted FUSE Control File System.
Jan 26 14:04:42 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 26 14:04:42 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 26 14:04:42 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 26 14:04:42 localhost systemd[1]: Finished Create System Users.
Jan 26 14:04:42 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 14:04:42 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 14:04:42 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 14:04:42 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 26 14:04:42 localhost systemd[1]: Reached target Local File Systems.
Jan 26 14:04:43 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 26 14:04:43 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 26 14:04:43 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 26 14:04:43 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 26 14:04:43 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 26 14:04:43 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 26 14:04:43 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 14:04:43 localhost bootctl[699]: Couldn't find EFI system partition, skipping.
Jan 26 14:04:43 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 26 14:04:43 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 26 14:04:43 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 14:04:43 localhost systemd[1]: Starting Security Auditing Service...
Jan 26 14:04:43 localhost systemd[1]: Starting RPC Bind...
Jan 26 14:04:43 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 26 14:04:43 localhost auditd[706]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 26 14:04:43 localhost auditd[706]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 26 14:04:43 localhost systemd[1]: Started RPC Bind.
Jan 26 14:04:43 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 26 14:04:43 localhost augenrules[711]: /sbin/augenrules: No change
Jan 26 14:04:43 localhost augenrules[726]: No rules
Jan 26 14:04:43 localhost augenrules[726]: enabled 1
Jan 26 14:04:43 localhost augenrules[726]: failure 1
Jan 26 14:04:43 localhost augenrules[726]: pid 706
Jan 26 14:04:43 localhost augenrules[726]: rate_limit 0
Jan 26 14:04:43 localhost augenrules[726]: backlog_limit 8192
Jan 26 14:04:43 localhost augenrules[726]: lost 0
Jan 26 14:04:43 localhost augenrules[726]: backlog 2
Jan 26 14:04:43 localhost augenrules[726]: backlog_wait_time 60000
Jan 26 14:04:43 localhost augenrules[726]: backlog_wait_time_actual 0
Jan 26 14:04:43 localhost augenrules[726]: enabled 1
Jan 26 14:04:43 localhost augenrules[726]: failure 1
Jan 26 14:04:43 localhost augenrules[726]: pid 706
Jan 26 14:04:43 localhost augenrules[726]: rate_limit 0
Jan 26 14:04:43 localhost augenrules[726]: backlog_limit 8192
Jan 26 14:04:43 localhost augenrules[726]: lost 0
Jan 26 14:04:43 localhost augenrules[726]: backlog 0
Jan 26 14:04:43 localhost augenrules[726]: backlog_wait_time 60000
Jan 26 14:04:43 localhost augenrules[726]: backlog_wait_time_actual 0
Jan 26 14:04:43 localhost augenrules[726]: enabled 1
Jan 26 14:04:43 localhost augenrules[726]: failure 1
Jan 26 14:04:43 localhost augenrules[726]: pid 706
Jan 26 14:04:43 localhost augenrules[726]: rate_limit 0
Jan 26 14:04:43 localhost augenrules[726]: backlog_limit 8192
Jan 26 14:04:43 localhost augenrules[726]: lost 0
Jan 26 14:04:43 localhost augenrules[726]: backlog 0
Jan 26 14:04:43 localhost augenrules[726]: backlog_wait_time 60000
Jan 26 14:04:43 localhost augenrules[726]: backlog_wait_time_actual 0
Jan 26 14:04:43 localhost systemd[1]: Started Security Auditing Service.
Jan 26 14:04:43 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 26 14:04:43 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 26 14:04:43 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 26 14:04:43 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 14:04:43 localhost systemd[1]: Starting Update is Completed...
Jan 26 14:04:43 localhost systemd[1]: Finished Update is Completed.
Jan 26 14:04:43 localhost systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 14:04:43 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 14:04:43 localhost systemd[1]: Reached target System Initialization.
Jan 26 14:04:43 localhost systemd[1]: Started dnf makecache --timer.
Jan 26 14:04:43 localhost systemd[1]: Started Daily rotation of log files.
Jan 26 14:04:43 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 26 14:04:43 localhost systemd[1]: Reached target Timer Units.
Jan 26 14:04:43 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 14:04:43 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 26 14:04:43 localhost systemd[1]: Reached target Socket Units.
Jan 26 14:04:43 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 26 14:04:43 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 14:04:43 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 26 14:04:43 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 14:04:43 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 14:04:43 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 14:04:43 localhost systemd-udevd[739]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 14:04:43 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 26 14:04:43 localhost systemd[1]: Reached target Basic System.
Jan 26 14:04:43 localhost dbus-broker-lau[756]: Ready
Jan 26 14:04:43 localhost systemd[1]: Starting NTP client/server...
Jan 26 14:04:43 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 26 14:04:43 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 26 14:04:43 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 26 14:04:43 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 26 14:04:43 localhost systemd[1]: Started irqbalance daemon.
Jan 26 14:04:43 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 26 14:04:43 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 14:04:43 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 14:04:43 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 14:04:43 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 26 14:04:43 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 26 14:04:43 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 26 14:04:43 localhost chronyd[792]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 14:04:43 localhost chronyd[792]: Loaded 0 symmetric keys
Jan 26 14:04:43 localhost chronyd[792]: Using right/UTC timezone to obtain leap second data
Jan 26 14:04:43 localhost chronyd[792]: Loaded seccomp filter (level 2)
Jan 26 14:04:43 localhost systemd[1]: Starting User Login Management...
Jan 26 14:04:43 localhost systemd[1]: Started NTP client/server.
Jan 26 14:04:43 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 26 14:04:43 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 26 14:04:43 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 26 14:04:43 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 26 14:04:43 localhost systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 14:04:43 localhost systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 14:04:43 localhost systemd-logind[795]: New seat seat0.
Jan 26 14:04:43 localhost systemd[1]: Started User Login Management.
Jan 26 14:04:43 localhost kernel: kvm_amd: TSC scaling supported
Jan 26 14:04:43 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 26 14:04:43 localhost kernel: kvm_amd: Nested Paging enabled
Jan 26 14:04:43 localhost kernel: kvm_amd: LBR virtualization supported
Jan 26 14:04:43 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 26 14:04:43 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 26 14:04:43 localhost kernel: Console: switching to colour dummy device 80x25
Jan 26 14:04:43 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 26 14:04:43 localhost kernel: [drm] features: -context_init
Jan 26 14:04:43 localhost kernel: [drm] number of scanouts: 1
Jan 26 14:04:43 localhost kernel: [drm] number of cap sets: 0
Jan 26 14:04:43 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 26 14:04:43 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 26 14:04:43 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 26 14:04:43 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 26 14:04:43 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 26 14:04:43 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 26 14:04:44 localhost iptables.init[783]: iptables: Applying firewall rules: [  OK  ]
Jan 26 14:04:44 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 26 14:04:44 localhost cloud-init[842]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 26 Jan 2026 14:04:44 +0000. Up 6.71 seconds.
Jan 26 14:04:44 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 26 14:04:44 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 26 14:04:44 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpygoshleb.mount: Deactivated successfully.
Jan 26 14:04:44 localhost systemd[1]: Starting Hostname Service...
Jan 26 14:04:44 localhost systemd[1]: Started Hostname Service.
Jan 26 14:04:44 np0005595786.novalocal systemd-hostnamed[856]: Hostname set to <np0005595786.novalocal> (static)
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Reached target Preparation for Network.
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Starting Network Manager...
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6280] NetworkManager (version 1.54.3-2.el9) is starting... (boot:ac2670c8-0eba-49ef-ba2e-d02b046debf0)
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6286] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6392] manager[0x5565aed2c000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6441] hostname: hostname: using hostnamed
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6442] hostname: static hostname changed from (none) to "np0005595786.novalocal"
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6445] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6556] manager[0x5565aed2c000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6557] manager[0x5565aed2c000]: rfkill: WWAN hardware radio set enabled
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6591] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6591] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6591] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6592] manager: Networking is enabled by state file
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6593] settings: Loaded settings plugin: keyfile (internal)
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6603] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6618] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6626] dhcp: init: Using DHCP client 'internal'
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6628] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6638] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6643] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6649] device (lo): Activation: starting connection 'lo' (90cc952b-85d2-4ca7-a327-6d073fb6794e)
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6659] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6668] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6706] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6710] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6712] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6714] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6716] device (eth0): carrier: link connected
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6718] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6723] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6729] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6733] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6733] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6735] manager: NetworkManager state is now CONNECTING
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6736] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6741] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6744] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Started Network Manager.
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Reached target Network.
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6935] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6937] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 14:04:44 np0005595786.novalocal NetworkManager[860]: <info>  [1769436284.6941] device (lo): Activation: successful, device activated.
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Reached target NFS client services.
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: Reached target Remote File Systems.
Jan 26 14:04:44 np0005595786.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 14:04:45 np0005595786.novalocal NetworkManager[860]: <info>  [1769436285.0962] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 14:04:45 np0005595786.novalocal NetworkManager[860]: <info>  [1769436285.0981] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 14:04:45 np0005595786.novalocal NetworkManager[860]: <info>  [1769436285.1017] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:04:45 np0005595786.novalocal NetworkManager[860]: <info>  [1769436285.1050] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:04:45 np0005595786.novalocal NetworkManager[860]: <info>  [1769436285.1053] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:04:45 np0005595786.novalocal NetworkManager[860]: <info>  [1769436285.1061] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 14:04:45 np0005595786.novalocal NetworkManager[860]: <info>  [1769436285.1065] device (eth0): Activation: successful, device activated.
Jan 26 14:04:45 np0005595786.novalocal NetworkManager[860]: <info>  [1769436285.1076] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 14:04:45 np0005595786.novalocal NetworkManager[860]: <info>  [1769436285.1079] manager: startup complete
Jan 26 14:04:45 np0005595786.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 26 14:04:45 np0005595786.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 26 Jan 2026 14:04:45 +0000. Up 8.06 seconds.
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: |  eth0  | True |        38.102.83.217         | 255.255.255.0 | global | fa:16:3e:2e:19:91 |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: |  eth0  | True | fe80::f816:3eff:fe2e:1991/64 |       .       |  link  | fa:16:3e:2e:19:91 |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 26 14:04:45 np0005595786.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 14:04:46 np0005595786.novalocal useradd[991]: new group: name=cloud-user, GID=1001
Jan 26 14:04:46 np0005595786.novalocal useradd[991]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 26 14:04:46 np0005595786.novalocal useradd[991]: add 'cloud-user' to group 'adm'
Jan 26 14:04:46 np0005595786.novalocal useradd[991]: add 'cloud-user' to group 'systemd-journal'
Jan 26 14:04:46 np0005595786.novalocal useradd[991]: add 'cloud-user' to shadow group 'adm'
Jan 26 14:04:46 np0005595786.novalocal useradd[991]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: Generating public/private rsa key pair.
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: The key fingerprint is:
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: SHA256:UiEaDy5s4YECZpDPSzIsZzTmgR/8q7LGGFE1tZEAxIc root@np0005595786.novalocal
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: The key's randomart image is:
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: +---[RSA 3072]----+
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |=X=+*o+..        |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |BoE+.*.+ .       |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |oO=*o o .        |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |=oO..  .         |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |.B . .. S        |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |. . .  .         |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |o. .             |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |oo.              |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |oo               |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: +----[SHA256]-----+
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: Generating public/private ecdsa key pair.
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: The key fingerprint is:
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: SHA256:S3O+Gv7rxPBvx6Pp/JJ3BQ0wq3hAWfTBQyHyTYlMfpc root@np0005595786.novalocal
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: The key's randomart image is:
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: +---[ECDSA 256]---+
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |        o*=+B+   |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |       ..+o=++.. |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |        . o =.Eo |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |         o o .. .|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |        S +    . |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |       . X      .|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |        o =  o  .|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |       . o ++.= .|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |        o+=+**.o |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: +----[SHA256]-----+
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: Generating public/private ed25519 key pair.
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: The key fingerprint is:
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: SHA256:U+QK4XW5LDHTD3uweEnWFdUcjbyui9MGJz3HVe5tCMY root@np0005595786.novalocal
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: The key's randomart image is:
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: +--[ED25519 256]--+
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |      . ..oo oo*=|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |     . o++B . o =|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |      o  OoX   o.|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |       .ooB E . o|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |        So + + +.|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |         .o + = +|
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |           = + . |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |          ..+    |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: |          .o..   |
Jan 26 14:04:46 np0005595786.novalocal cloud-init[924]: +----[SHA256]-----+
Jan 26 14:04:46 np0005595786.novalocal sm-notify[1007]: Version 2.5.4 starting
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Reached target Network is Online.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Starting System Logging Service...
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Starting Permit User Sessions...
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 26 14:04:46 np0005595786.novalocal sshd[1009]: Server listening on 0.0.0.0 port 22.
Jan 26 14:04:46 np0005595786.novalocal sshd[1009]: Server listening on :: port 22.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Finished Permit User Sessions.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Started Command Scheduler.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Started Getty on tty1.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Reached target Login Prompts.
Jan 26 14:04:46 np0005595786.novalocal crond[1012]: (CRON) STARTUP (1.5.7)
Jan 26 14:04:46 np0005595786.novalocal crond[1012]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 26 14:04:46 np0005595786.novalocal crond[1012]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 49% if used.)
Jan 26 14:04:46 np0005595786.novalocal crond[1012]: (CRON) INFO (running with inotify support)
Jan 26 14:04:46 np0005595786.novalocal rsyslogd[1008]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1008" x-info="https://www.rsyslog.com"] start
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Started System Logging Service.
Jan 26 14:04:46 np0005595786.novalocal rsyslogd[1008]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Reached target Multi-User System.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 26 14:04:46 np0005595786.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 26 14:04:46 np0005595786.novalocal rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:04:46 np0005595786.novalocal kdumpctl[1017]: kdump: No kdump initial ramdisk found.
Jan 26 14:04:46 np0005595786.novalocal kdumpctl[1017]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1106]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 26 Jan 2026 14:04:46 +0000. Up 9.65 seconds.
Jan 26 14:04:47 np0005595786.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 26 14:04:47 np0005595786.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1263]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 26 Jan 2026 14:04:47 +0000. Up 10.02 seconds.
Jan 26 14:04:47 np0005595786.novalocal dracut[1271]: dracut-057-102.git20250818.el9
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1275]: #############################################################
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1278]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1291]: 256 SHA256:S3O+Gv7rxPBvx6Pp/JJ3BQ0wq3hAWfTBQyHyTYlMfpc root@np0005595786.novalocal (ECDSA)
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1293]: 256 SHA256:U+QK4XW5LDHTD3uweEnWFdUcjbyui9MGJz3HVe5tCMY root@np0005595786.novalocal (ED25519)
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1295]: 3072 SHA256:UiEaDy5s4YECZpDPSzIsZzTmgR/8q7LGGFE1tZEAxIc root@np0005595786.novalocal (RSA)
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1296]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1297]: #############################################################
Jan 26 14:04:47 np0005595786.novalocal cloud-init[1263]: Cloud-init v. 24.4-8.el9 finished at Mon, 26 Jan 2026 14:04:47 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.22 seconds
Jan 26 14:04:47 np0005595786.novalocal sshd-session[1301]: Connection reset by 38.102.83.114 port 54474 [preauth]
Jan 26 14:04:47 np0005595786.novalocal sshd-session[1305]: Unable to negotiate with 38.102.83.114 port 54480: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 26 14:04:47 np0005595786.novalocal dracut[1274]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 26 14:04:47 np0005595786.novalocal sshd-session[1314]: Connection reset by 38.102.83.114 port 54492 [preauth]
Jan 26 14:04:47 np0005595786.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 26 14:04:47 np0005595786.novalocal systemd[1]: Reached target Cloud-init target.
Jan 26 14:04:47 np0005595786.novalocal sshd-session[1326]: Unable to negotiate with 38.102.83.114 port 54494: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 26 14:04:47 np0005595786.novalocal sshd-session[1332]: Unable to negotiate with 38.102.83.114 port 54508: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 26 14:04:47 np0005595786.novalocal sshd-session[1357]: Connection reset by 38.102.83.114 port 54528 [preauth]
Jan 26 14:04:47 np0005595786.novalocal sshd-session[1367]: Unable to negotiate with 38.102.83.114 port 54540: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 26 14:04:47 np0005595786.novalocal sshd-session[1369]: Unable to negotiate with 38.102.83.114 port 54550: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 26 14:04:47 np0005595786.novalocal sshd-session[1343]: Connection closed by 38.102.83.114 port 54516 [preauth]
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: memstrack is not available
Jan 26 14:04:48 np0005595786.novalocal dracut[1274]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: memstrack is not available
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: *** Including module: systemd ***
Jan 26 14:04:49 np0005595786.novalocal dracut[1274]: *** Including module: fips ***
Jan 26 14:04:50 np0005595786.novalocal chronyd[792]: Selected source 206.108.0.131 (2.centos.pool.ntp.org)
Jan 26 14:04:50 np0005595786.novalocal chronyd[792]: System clock TAI offset set to 37 seconds
Jan 26 14:04:50 np0005595786.novalocal dracut[1274]: *** Including module: systemd-initrd ***
Jan 26 14:04:50 np0005595786.novalocal dracut[1274]: *** Including module: i18n ***
Jan 26 14:04:50 np0005595786.novalocal dracut[1274]: *** Including module: drm ***
Jan 26 14:04:50 np0005595786.novalocal dracut[1274]: *** Including module: prefixdevname ***
Jan 26 14:04:50 np0005595786.novalocal dracut[1274]: *** Including module: kernel-modules ***
Jan 26 14:04:51 np0005595786.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]: *** Including module: kernel-modules-extra ***
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]: *** Including module: qemu ***
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]: *** Including module: fstab-sys ***
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]: *** Including module: rootfs-block ***
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]: *** Including module: terminfo ***
Jan 26 14:04:51 np0005595786.novalocal dracut[1274]: *** Including module: udev-rules ***
Jan 26 14:04:52 np0005595786.novalocal dracut[1274]: Skipping udev rule: 91-permissions.rules
Jan 26 14:04:52 np0005595786.novalocal dracut[1274]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 26 14:04:52 np0005595786.novalocal dracut[1274]: *** Including module: virtiofs ***
Jan 26 14:04:52 np0005595786.novalocal dracut[1274]: *** Including module: dracut-systemd ***
Jan 26 14:04:52 np0005595786.novalocal dracut[1274]: *** Including module: usrmount ***
Jan 26 14:04:52 np0005595786.novalocal dracut[1274]: *** Including module: base ***
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]: *** Including module: fs-lib ***
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]: *** Including module: kdumpbase ***
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:   microcode_ctl module: mangling fw_dir
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel" is ignored
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 26 14:04:53 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]: *** Including module: openssl ***
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]: *** Including module: shutdown ***
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]: *** Including module: squash ***
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]: *** Including modules done ***
Jan 26 14:04:54 np0005595786.novalocal dracut[1274]: *** Installing kernel module dependencies ***
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: IRQ 25 affinity is now unmanaged
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: IRQ 31 affinity is now unmanaged
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: IRQ 28 affinity is now unmanaged
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: IRQ 32 affinity is now unmanaged
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: IRQ 30 affinity is now unmanaged
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 26 14:04:54 np0005595786.novalocal irqbalance[790]: IRQ 29 affinity is now unmanaged
Jan 26 14:04:55 np0005595786.novalocal dracut[1274]: *** Installing kernel module dependencies done ***
Jan 26 14:04:55 np0005595786.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 14:04:55 np0005595786.novalocal dracut[1274]: *** Resolving executable dependencies ***
Jan 26 14:04:57 np0005595786.novalocal dracut[1274]: *** Resolving executable dependencies done ***
Jan 26 14:04:57 np0005595786.novalocal dracut[1274]: *** Generating early-microcode cpio image ***
Jan 26 14:04:57 np0005595786.novalocal dracut[1274]: *** Store current command line parameters ***
Jan 26 14:04:57 np0005595786.novalocal dracut[1274]: Stored kernel commandline:
Jan 26 14:04:57 np0005595786.novalocal dracut[1274]: No dracut internal kernel commandline stored in the initramfs
Jan 26 14:04:57 np0005595786.novalocal dracut[1274]: *** Install squash loader ***
Jan 26 14:04:58 np0005595786.novalocal dracut[1274]: *** Squashing the files inside the initramfs ***
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: *** Squashing the files inside the initramfs done ***
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: *** Hardlinking files ***
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: Mode:           real
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: Files:          50
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: Linked:         0 files
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: Compared:       0 xattrs
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: Compared:       0 files
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: Saved:          0 B
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: Duration:       0.000504 seconds
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: *** Hardlinking files done ***
Jan 26 14:04:59 np0005595786.novalocal dracut[1274]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 26 14:05:00 np0005595786.novalocal kdumpctl[1017]: kdump: kexec: loaded kdump kernel
Jan 26 14:05:00 np0005595786.novalocal kdumpctl[1017]: kdump: Starting kdump: [OK]
Jan 26 14:05:00 np0005595786.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 26 14:05:00 np0005595786.novalocal systemd[1]: Startup finished in 1.669s (kernel) + 3.006s (initrd) + 18.308s (userspace) = 22.984s.
Jan 26 14:05:11 np0005595786.novalocal sshd-session[4305]: Accepted publickey for zuul from 38.102.83.114 port 45856 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 26 14:05:11 np0005595786.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 26 14:05:11 np0005595786.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 26 14:05:11 np0005595786.novalocal systemd-logind[795]: New session 1 of user zuul.
Jan 26 14:05:11 np0005595786.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 26 14:05:11 np0005595786.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Queued start job for default target Main User Target.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Created slice User Application Slice.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Reached target Paths.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Reached target Timers.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Starting D-Bus User Message Bus Socket...
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Starting Create User's Volatile Files and Directories...
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Finished Create User's Volatile Files and Directories.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Listening on D-Bus User Message Bus Socket.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Reached target Sockets.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Reached target Basic System.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Reached target Main User Target.
Jan 26 14:05:11 np0005595786.novalocal systemd[4309]: Startup finished in 134ms.
Jan 26 14:05:11 np0005595786.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 26 14:05:11 np0005595786.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 26 14:05:11 np0005595786.novalocal sshd-session[4305]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:05:12 np0005595786.novalocal python3[4392]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:05:14 np0005595786.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 14:05:15 np0005595786.novalocal python3[4422]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:05:21 np0005595786.novalocal python3[4480]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:05:22 np0005595786.novalocal python3[4520]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 26 14:05:24 np0005595786.novalocal python3[4546]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC6q6y5VlUogE2wzsFEyhHoovFc1bpmlaeYY8RbQzXyU4D/CRWWefr9z7cNz1KKZFXP2PW2aB0yodNdGfRrbXfcxKJSpOOrWkiA8zkHEzC15kCaE+Q+di96X6BerT3UZHsl2wzHUDQrOV4bjYk986idouEMZtY2CecRlk4dC8jjSzYPjlQwVwp92gAF6rGSmeUZ6ov4iJq94q8vzW5PaRihXHMnXW/8zeTW2tHW+q9OVkszx9OnR09vcmGXzrafTkI8jenuk3hbFZxp6FiXdOPLzTL1VrWo/dPxkvQUfkUlMYMGJ2kNqfBWHAUyqzFoZCqjziNWR4HtM1hFbLx4ckow3ru9pf07SbQIDFI8MaNnJwWOK72YnlqAt3cOZO7FTGTGtfvAlsj4b7VodanhT4eZ01IA20B+YCxHdcAqR6eyacRlqx1P9BPrZFD513w3DRSd8DzhzlJFZkunIqRzGYUMUN/rGNI84n5Rd9OwnSpSzGRy9gLhxIjFKqOe4nU4FM0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:24 np0005595786.novalocal python3[4570]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:25 np0005595786.novalocal python3[4669]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:05:25 np0005595786.novalocal python3[4740]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769436324.7281625-230-256714432939813/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=c77ccf35e1f1439db9f1a8a765b9909c_id_rsa follow=False checksum=718aefa1bf668c53c07dd6860881ecbe60adc304 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:26 np0005595786.novalocal python3[4863]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:05:26 np0005595786.novalocal python3[4934]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769436325.7784977-274-57124522468576/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=c77ccf35e1f1439db9f1a8a765b9909c_id_rsa.pub follow=False checksum=3578b6c90ac975d34fa7df971038fbbbdfd9dc26 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:27 np0005595786.novalocal python3[4982]: ansible-ping Invoked with data=pong
Jan 26 14:05:28 np0005595786.novalocal python3[5006]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:05:30 np0005595786.novalocal python3[5064]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 26 14:05:32 np0005595786.novalocal python3[5096]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:32 np0005595786.novalocal python3[5120]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:32 np0005595786.novalocal python3[5144]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:33 np0005595786.novalocal python3[5168]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:33 np0005595786.novalocal python3[5192]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:33 np0005595786.novalocal python3[5216]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:34 np0005595786.novalocal sudo[5240]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofeiyzwdzeehufqhehvrcwfztqspxxgm ; /usr/bin/python3'
Jan 26 14:05:34 np0005595786.novalocal sudo[5240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:35 np0005595786.novalocal python3[5242]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:35 np0005595786.novalocal sudo[5240]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:35 np0005595786.novalocal sudo[5318]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnnboevozzycpglffzwmoqwacppvreyc ; /usr/bin/python3'
Jan 26 14:05:35 np0005595786.novalocal sudo[5318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:35 np0005595786.novalocal python3[5320]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:05:35 np0005595786.novalocal sudo[5318]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:36 np0005595786.novalocal sudo[5391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvwxotiyvqooqpuuclvgignmoxbkzkeu ; /usr/bin/python3'
Jan 26 14:05:36 np0005595786.novalocal sudo[5391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:36 np0005595786.novalocal python3[5393]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769436335.376002-28-246962326221754/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:36 np0005595786.novalocal sudo[5391]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:37 np0005595786.novalocal python3[5441]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:37 np0005595786.novalocal python3[5465]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:37 np0005595786.novalocal python3[5489]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:37 np0005595786.novalocal python3[5513]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:38 np0005595786.novalocal python3[5537]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:38 np0005595786.novalocal python3[5561]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:38 np0005595786.novalocal python3[5585]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:38 np0005595786.novalocal python3[5609]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:39 np0005595786.novalocal python3[5633]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:39 np0005595786.novalocal python3[5657]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:39 np0005595786.novalocal python3[5681]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:40 np0005595786.novalocal python3[5705]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:40 np0005595786.novalocal python3[5729]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:40 np0005595786.novalocal python3[5753]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:41 np0005595786.novalocal python3[5777]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:41 np0005595786.novalocal python3[5801]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:41 np0005595786.novalocal python3[5825]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:41 np0005595786.novalocal python3[5849]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:42 np0005595786.novalocal python3[5873]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:42 np0005595786.novalocal python3[5897]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:42 np0005595786.novalocal python3[5921]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:43 np0005595786.novalocal python3[5945]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:43 np0005595786.novalocal python3[5969]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:43 np0005595786.novalocal python3[5993]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:44 np0005595786.novalocal python3[6017]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:44 np0005595786.novalocal python3[6041]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:05:46 np0005595786.novalocal sudo[6065]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhkbzyiiesocpcrugcrnisnkxbljnpvq ; /usr/bin/python3'
Jan 26 14:05:46 np0005595786.novalocal sudo[6065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:46 np0005595786.novalocal python3[6067]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 14:05:46 np0005595786.novalocal systemd[1]: Starting Time & Date Service...
Jan 26 14:05:46 np0005595786.novalocal systemd[1]: Started Time & Date Service.
Jan 26 14:05:47 np0005595786.novalocal systemd-timedated[6069]: Changed time zone to 'UTC' (UTC).
Jan 26 14:05:47 np0005595786.novalocal sudo[6065]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:47 np0005595786.novalocal sudo[6096]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwogtbldtpjgelwsvpxtygfsrnhrrmlc ; /usr/bin/python3'
Jan 26 14:05:47 np0005595786.novalocal sudo[6096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:47 np0005595786.novalocal python3[6098]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:47 np0005595786.novalocal sudo[6096]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:47 np0005595786.novalocal python3[6174]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:05:48 np0005595786.novalocal python3[6245]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769436347.617853-203-83294932976436/source _original_basename=tmp81vfjzkh follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:48 np0005595786.novalocal python3[6345]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:05:49 np0005595786.novalocal python3[6416]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769436348.5416067-243-15730291032362/source _original_basename=tmpjkcu3lsu follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:49 np0005595786.novalocal sudo[6516]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cungflkwrouvxitgcjwukctlfrplbvqv ; /usr/bin/python3'
Jan 26 14:05:49 np0005595786.novalocal sudo[6516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:50 np0005595786.novalocal python3[6518]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:05:50 np0005595786.novalocal sudo[6516]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:50 np0005595786.novalocal sudo[6589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyqphqksegmwerqmrzkbymbbisyinehz ; /usr/bin/python3'
Jan 26 14:05:50 np0005595786.novalocal sudo[6589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:50 np0005595786.novalocal python3[6591]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769436349.7338595-307-165150921202708/source _original_basename=tmpg1b2j75t follow=False checksum=f8e7a25c67610e75d05bab7943d515214e034b21 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:50 np0005595786.novalocal sudo[6589]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:51 np0005595786.novalocal python3[6639]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:05:51 np0005595786.novalocal python3[6665]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:05:51 np0005595786.novalocal sudo[6743]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcclzsvbddiduykgxpelpjbuklvwlnbr ; /usr/bin/python3'
Jan 26 14:05:51 np0005595786.novalocal sudo[6743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:51 np0005595786.novalocal python3[6745]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:05:51 np0005595786.novalocal sudo[6743]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:52 np0005595786.novalocal sudo[6816]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgoqnthkymqpuyejkbmowhtsskwiehxk ; /usr/bin/python3'
Jan 26 14:05:52 np0005595786.novalocal sudo[6816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:52 np0005595786.novalocal python3[6818]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769436351.4310732-363-17671007201336/source _original_basename=tmpd3sn40qh follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:05:52 np0005595786.novalocal sudo[6816]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:52 np0005595786.novalocal sudo[6867]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoowncysvlidgawizuingldariyhdiik ; /usr/bin/python3'
Jan 26 14:05:52 np0005595786.novalocal sudo[6867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:05:52 np0005595786.novalocal python3[6869]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-a5d1-278f-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:05:52 np0005595786.novalocal sudo[6867]: pam_unix(sudo:session): session closed for user root
Jan 26 14:05:53 np0005595786.novalocal python3[6897]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-a5d1-278f-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 26 14:05:54 np0005595786.novalocal python3[6925]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:06:17 np0005595786.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 14:06:26 np0005595786.novalocal sudo[6951]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhrktaruxardigezbjbwfvnzbesljhtm ; /usr/bin/python3'
Jan 26 14:06:26 np0005595786.novalocal sudo[6951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:06:27 np0005595786.novalocal python3[6953]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:06:27 np0005595786.novalocal sudo[6951]: pam_unix(sudo:session): session closed for user root
Jan 26 14:07:27 np0005595786.novalocal sshd-session[4319]: Received disconnect from 38.102.83.114 port 45856:11: disconnected by user
Jan 26 14:07:27 np0005595786.novalocal sshd-session[4319]: Disconnected from user zuul 38.102.83.114 port 45856
Jan 26 14:07:27 np0005595786.novalocal sshd-session[4305]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:07:27 np0005595786.novalocal systemd-logind[795]: Session 1 logged out. Waiting for processes to exit.
Jan 26 14:07:30 np0005595786.novalocal systemd[4309]: Starting Mark boot as successful...
Jan 26 14:07:30 np0005595786.novalocal systemd[4309]: Finished Mark boot as successful.
Jan 26 14:07:30 np0005595786.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 14:07:30 np0005595786.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 26 14:07:30 np0005595786.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 26 14:07:30 np0005595786.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 26 14:07:30 np0005595786.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 26 14:07:30 np0005595786.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 26 14:07:30 np0005595786.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 26 14:07:30 np0005595786.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 26 14:07:30 np0005595786.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 26 14:07:30 np0005595786.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.2833] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 14:07:30 np0005595786.novalocal systemd-udevd[6956]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3021] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3063] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3070] device (eth1): carrier: link connected
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3074] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3084] policy: auto-activating connection 'Wired connection 1' (6287cf3c-4986-3843-adcc-048b89f566c4)
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3089] device (eth1): Activation: starting connection 'Wired connection 1' (6287cf3c-4986-3843-adcc-048b89f566c4)
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3091] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3097] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3104] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:07:30 np0005595786.novalocal NetworkManager[860]: <info>  [1769436450.3110] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 14:07:30 np0005595786.novalocal sshd-session[6959]: Accepted publickey for zuul from 38.102.83.114 port 56390 ssh2: RSA SHA256:A4Dpo32LetI86PQcYPV26+sn2SDgPIozVnE9yyb/P6Q
Jan 26 14:07:31 np0005595786.novalocal systemd-logind[795]: New session 3 of user zuul.
Jan 26 14:07:31 np0005595786.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 26 14:07:31 np0005595786.novalocal sshd-session[6959]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:07:31 np0005595786.novalocal python3[6986]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-2dd1-fb61-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:07:38 np0005595786.novalocal sudo[7064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlmjxttcpyhwxfbdqzbthfvlvqmvbpxu ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 14:07:38 np0005595786.novalocal sudo[7064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:07:38 np0005595786.novalocal python3[7066]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:07:38 np0005595786.novalocal sudo[7064]: pam_unix(sudo:session): session closed for user root
Jan 26 14:07:38 np0005595786.novalocal sudo[7137]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tevirqjaikpslpmxvaqtvyqaxjrwemyx ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 14:07:38 np0005595786.novalocal sudo[7137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:07:38 np0005595786.novalocal python3[7139]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769436458.0051587-154-198051995607876/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=47de4a25c4cc3df2726a9fbfa2cf0842a6143526 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:07:38 np0005595786.novalocal sudo[7137]: pam_unix(sudo:session): session closed for user root
Jan 26 14:07:39 np0005595786.novalocal sudo[7187]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ligobmrrfafwtcmxxbcqpmxapezddcuk ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 14:07:39 np0005595786.novalocal sudo[7187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:07:39 np0005595786.novalocal python3[7189]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Stopping Network Manager...
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[860]: <info>  [1769436459.3918] caught SIGTERM, shutting down normally.
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[860]: <info>  [1769436459.3931] dhcp4 (eth0): canceled DHCP transaction
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[860]: <info>  [1769436459.3932] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[860]: <info>  [1769436459.3932] dhcp4 (eth0): state changed no lease
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[860]: <info>  [1769436459.3937] manager: NetworkManager state is now CONNECTING
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[860]: <info>  [1769436459.4044] dhcp4 (eth1): canceled DHCP transaction
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[860]: <info>  [1769436459.4044] dhcp4 (eth1): state changed no lease
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[860]: <info>  [1769436459.4109] exiting (success)
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Stopped Network Manager.
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: NetworkManager.service: Consumed 1.326s CPU time, 9.9M memory peak.
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Starting Network Manager...
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.4744] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ac2670c8-0eba-49ef-ba2e-d02b046debf0)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.4746] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.4805] manager[0x5566f60da000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Starting Hostname Service...
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Started Hostname Service.
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5759] hostname: hostname: using hostnamed
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5760] hostname: static hostname changed from (none) to "np0005595786.novalocal"
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5766] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5771] manager[0x5566f60da000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5771] manager[0x5566f60da000]: rfkill: WWAN hardware radio set enabled
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5797] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5798] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5798] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5798] manager: Networking is enabled by state file
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5800] settings: Loaded settings plugin: keyfile (internal)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5804] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5825] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5833] dhcp: init: Using DHCP client 'internal'
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5836] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5842] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5846] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5851] device (lo): Activation: starting connection 'lo' (90cc952b-85d2-4ca7-a327-6d073fb6794e)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5856] device (eth0): carrier: link connected
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5861] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5864] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5864] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5868] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5872] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5876] device (eth1): carrier: link connected
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5879] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5882] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (6287cf3c-4986-3843-adcc-048b89f566c4) (indicated)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5882] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5886] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5892] device (eth1): Activation: starting connection 'Wired connection 1' (6287cf3c-4986-3843-adcc-048b89f566c4)
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Started Network Manager.
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5899] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5901] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5903] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5904] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5905] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5907] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5908] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5910] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5911] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5916] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5918] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5925] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5927] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5946] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5947] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5950] device (lo): Activation: successful, device activated.
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5958] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.5963] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.6016] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.6031] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.6032] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.6035] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.6037] device (eth0): Activation: successful, device activated.
Jan 26 14:07:39 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436459.6040] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 14:07:39 np0005595786.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 26 14:07:39 np0005595786.novalocal sudo[7187]: pam_unix(sudo:session): session closed for user root
Jan 26 14:07:39 np0005595786.novalocal python3[7273]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-2dd1-fb61-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:07:49 np0005595786.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 14:08:09 np0005595786.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3407] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 14:08:25 np0005595786.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 14:08:25 np0005595786.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3809] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3812] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3823] device (eth1): Activation: successful, device activated.
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3834] manager: startup complete
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3836] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <warn>  [1769436505.3844] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3858] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 26 14:08:25 np0005595786.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3924] dhcp4 (eth1): canceled DHCP transaction
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3924] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3924] dhcp4 (eth1): state changed no lease
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3946] policy: auto-activating connection 'ci-private-network' (1467aadd-b515-5a03-83b1-dc086af911e2)
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3953] device (eth1): Activation: starting connection 'ci-private-network' (1467aadd-b515-5a03-83b1-dc086af911e2)
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3955] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3959] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3970] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.3985] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.4033] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.4035] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:08:25 np0005595786.novalocal NetworkManager[7198]: <info>  [1769436505.4040] device (eth1): Activation: successful, device activated.
Jan 26 14:08:35 np0005595786.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 14:08:40 np0005595786.novalocal sshd-session[6962]: Received disconnect from 38.102.83.114 port 56390:11: disconnected by user
Jan 26 14:08:40 np0005595786.novalocal sshd-session[6962]: Disconnected from user zuul 38.102.83.114 port 56390
Jan 26 14:08:40 np0005595786.novalocal sshd-session[6959]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:08:40 np0005595786.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 26 14:08:40 np0005595786.novalocal systemd[1]: session-3.scope: Consumed 1.796s CPU time.
Jan 26 14:08:40 np0005595786.novalocal systemd-logind[795]: Session 3 logged out. Waiting for processes to exit.
Jan 26 14:08:40 np0005595786.novalocal systemd-logind[795]: Removed session 3.
Jan 26 14:08:44 np0005595786.novalocal sshd-session[7301]: Accepted publickey for zuul from 38.102.83.114 port 36966 ssh2: RSA SHA256:A4Dpo32LetI86PQcYPV26+sn2SDgPIozVnE9yyb/P6Q
Jan 26 14:08:44 np0005595786.novalocal systemd-logind[795]: New session 4 of user zuul.
Jan 26 14:08:45 np0005595786.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 26 14:08:45 np0005595786.novalocal sshd-session[7301]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:08:45 np0005595786.novalocal sudo[7380]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuojgdmyvwsvaalhzbnwcnawafemfvey ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 14:08:45 np0005595786.novalocal sudo[7380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:08:45 np0005595786.novalocal python3[7382]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:08:45 np0005595786.novalocal sudo[7380]: pam_unix(sudo:session): session closed for user root
Jan 26 14:08:45 np0005595786.novalocal sudo[7453]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmczcjdvkrvpxjoknuceckklcrejnvtf ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 14:08:45 np0005595786.novalocal sudo[7453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:08:45 np0005595786.novalocal python3[7455]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769436525.1311505-312-204472735476230/source _original_basename=tmps5q27iku follow=False checksum=5243b0bb251a97d1ca38a4ef84a61760fdc57991 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:08:45 np0005595786.novalocal sudo[7453]: pam_unix(sudo:session): session closed for user root
Jan 26 14:08:48 np0005595786.novalocal sshd-session[7304]: Connection closed by 38.102.83.114 port 36966
Jan 26 14:08:48 np0005595786.novalocal sshd-session[7301]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:08:48 np0005595786.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 26 14:08:48 np0005595786.novalocal systemd-logind[795]: Session 4 logged out. Waiting for processes to exit.
Jan 26 14:08:48 np0005595786.novalocal systemd-logind[795]: Removed session 4.
Jan 26 14:10:28 np0005595786.novalocal systemd[4309]: Created slice User Background Tasks Slice.
Jan 26 14:10:28 np0005595786.novalocal systemd[4309]: Starting Cleanup of User's Temporary Files and Directories...
Jan 26 14:10:28 np0005595786.novalocal systemd[4309]: Finished Cleanup of User's Temporary Files and Directories.
Jan 26 14:14:38 np0005595786.novalocal sshd-session[7485]: Accepted publickey for zuul from 38.102.83.114 port 60212 ssh2: RSA SHA256:A4Dpo32LetI86PQcYPV26+sn2SDgPIozVnE9yyb/P6Q
Jan 26 14:14:38 np0005595786.novalocal systemd-logind[795]: New session 5 of user zuul.
Jan 26 14:14:38 np0005595786.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 26 14:14:38 np0005595786.novalocal sshd-session[7485]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:14:38 np0005595786.novalocal sudo[7512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-molxjnnaddxmngcvgueitqafwtcqsjhw ; /usr/bin/python3'
Jan 26 14:14:38 np0005595786.novalocal sudo[7512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:38 np0005595786.novalocal python3[7514]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-5f7d-4578-00000000216c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:14:38 np0005595786.novalocal sudo[7512]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:38 np0005595786.novalocal sudo[7540]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mshzlgapjcygmufndtatpyviorlcaoih ; /usr/bin/python3'
Jan 26 14:14:38 np0005595786.novalocal sudo[7540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:38 np0005595786.novalocal python3[7542]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:14:38 np0005595786.novalocal sudo[7540]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:38 np0005595786.novalocal sudo[7567]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bewfepfagsihagqhbymeibybmuihfqmq ; /usr/bin/python3'
Jan 26 14:14:38 np0005595786.novalocal sudo[7567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:38 np0005595786.novalocal python3[7569]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:14:38 np0005595786.novalocal sudo[7567]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:39 np0005595786.novalocal sudo[7593]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cddfcsepgzxdsjhwrehukmyhpqifqzlp ; /usr/bin/python3'
Jan 26 14:14:39 np0005595786.novalocal sudo[7593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:39 np0005595786.novalocal python3[7595]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:14:39 np0005595786.novalocal sudo[7593]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:39 np0005595786.novalocal sudo[7619]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zewbzlwpmvdkufkcqamoonlqrhhzjpol ; /usr/bin/python3'
Jan 26 14:14:39 np0005595786.novalocal sudo[7619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:39 np0005595786.novalocal python3[7621]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:14:39 np0005595786.novalocal sudo[7619]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:39 np0005595786.novalocal sudo[7645]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwaylygknbapmrrolminapgwnbxkulrt ; /usr/bin/python3'
Jan 26 14:14:39 np0005595786.novalocal sudo[7645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:40 np0005595786.novalocal python3[7647]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:14:40 np0005595786.novalocal sudo[7645]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:40 np0005595786.novalocal sudo[7723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wauqzdnxreuhrpwdjpjhpylmmsenuljq ; /usr/bin/python3'
Jan 26 14:14:40 np0005595786.novalocal sudo[7723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:40 np0005595786.novalocal python3[7725]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:14:40 np0005595786.novalocal sudo[7723]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:40 np0005595786.novalocal sudo[7796]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyekpimekuyvlbpezrfoccqncqmflkui ; /usr/bin/python3'
Jan 26 14:14:40 np0005595786.novalocal sudo[7796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:40 np0005595786.novalocal python3[7798]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769436880.2394316-514-178330731338536/source _original_basename=tmpekz5lpjc follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:14:41 np0005595786.novalocal sudo[7796]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:41 np0005595786.novalocal sudo[7846]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzmjtlywgndkdlxesjqtnbeqjjchysfd ; /usr/bin/python3'
Jan 26 14:14:41 np0005595786.novalocal sudo[7846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:41 np0005595786.novalocal python3[7848]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:14:41 np0005595786.novalocal systemd[1]: Reloading.
Jan 26 14:14:41 np0005595786.novalocal systemd-rc-local-generator[7869]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:14:41 np0005595786.novalocal sudo[7846]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:43 np0005595786.novalocal sudo[7901]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baviaxzcoqghfmavgefeozaoqkcxhtnm ; /usr/bin/python3'
Jan 26 14:14:43 np0005595786.novalocal sudo[7901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:43 np0005595786.novalocal python3[7903]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 26 14:14:43 np0005595786.novalocal sudo[7901]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:43 np0005595786.novalocal sudo[7927]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alzeyljytwoseqbpgsfzflplkgjqncdd ; /usr/bin/python3'
Jan 26 14:14:43 np0005595786.novalocal sudo[7927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:43 np0005595786.novalocal python3[7929]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:14:43 np0005595786.novalocal sudo[7927]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:43 np0005595786.novalocal sudo[7955]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqzznrrhimdbfvlxsuadhbejucztgdvb ; /usr/bin/python3'
Jan 26 14:14:43 np0005595786.novalocal sudo[7955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:43 np0005595786.novalocal python3[7957]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:14:43 np0005595786.novalocal sudo[7955]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:44 np0005595786.novalocal sudo[7983]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igziettremxiolwiznzqqdjukxutypjc ; /usr/bin/python3'
Jan 26 14:14:44 np0005595786.novalocal sudo[7983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:44 np0005595786.novalocal python3[7985]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:14:44 np0005595786.novalocal sudo[7983]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:44 np0005595786.novalocal sudo[8011]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvlnwgdeassumwdvthlmvqmbbfgplvfx ; /usr/bin/python3'
Jan 26 14:14:44 np0005595786.novalocal sudo[8011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:44 np0005595786.novalocal python3[8013]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:14:44 np0005595786.novalocal sudo[8011]: pam_unix(sudo:session): session closed for user root
Jan 26 14:14:45 np0005595786.novalocal python3[8040]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-5f7d-4578-000000002173-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:14:46 np0005595786.novalocal python3[8070]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 26 14:14:48 np0005595786.novalocal sshd-session[7488]: Connection closed by 38.102.83.114 port 60212
Jan 26 14:14:48 np0005595786.novalocal sshd-session[7485]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:14:48 np0005595786.novalocal systemd-logind[795]: Session 5 logged out. Waiting for processes to exit.
Jan 26 14:14:48 np0005595786.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 26 14:14:48 np0005595786.novalocal systemd[1]: session-5.scope: Consumed 4.506s CPU time.
Jan 26 14:14:48 np0005595786.novalocal systemd-logind[795]: Removed session 5.
Jan 26 14:14:49 np0005595786.novalocal sshd-session[8075]: Accepted publickey for zuul from 38.102.83.114 port 54626 ssh2: RSA SHA256:A4Dpo32LetI86PQcYPV26+sn2SDgPIozVnE9yyb/P6Q
Jan 26 14:14:49 np0005595786.novalocal systemd-logind[795]: New session 6 of user zuul.
Jan 26 14:14:50 np0005595786.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 26 14:14:50 np0005595786.novalocal sshd-session[8075]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:14:50 np0005595786.novalocal sudo[8102]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwehebdphdgwlpwuhlngffvmgkxyhrri ; /usr/bin/python3'
Jan 26 14:14:50 np0005595786.novalocal sudo[8102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:14:50 np0005595786.novalocal python3[8104]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 26 14:14:56 np0005595786.novalocal setsebool[8142]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 26 14:14:56 np0005595786.novalocal setsebool[8142]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 26 14:15:06 np0005595786.novalocal kernel: SELinux:  Converting 385 SID table entries...
Jan 26 14:15:06 np0005595786.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 14:15:06 np0005595786.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 26 14:15:06 np0005595786.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 14:15:06 np0005595786.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 26 14:15:06 np0005595786.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 14:15:06 np0005595786.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 14:15:06 np0005595786.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 14:15:16 np0005595786.novalocal kernel: SELinux:  Converting 388 SID table entries...
Jan 26 14:15:16 np0005595786.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 14:15:16 np0005595786.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 26 14:15:16 np0005595786.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 14:15:16 np0005595786.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 26 14:15:16 np0005595786.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 14:15:16 np0005595786.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 14:15:16 np0005595786.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 14:15:34 np0005595786.novalocal dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 14:15:34 np0005595786.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 14:15:34 np0005595786.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 26 14:15:34 np0005595786.novalocal systemd[1]: Reloading.
Jan 26 14:15:34 np0005595786.novalocal systemd-rc-local-generator[8911]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:15:34 np0005595786.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 14:15:36 np0005595786.novalocal sudo[8102]: pam_unix(sudo:session): session closed for user root
Jan 26 14:15:43 np0005595786.novalocal python3[14452]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ec2-ffbe-b842-7e67-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:15:44 np0005595786.novalocal kernel: evm: overlay not supported
Jan 26 14:15:44 np0005595786.novalocal systemd[4309]: Starting D-Bus User Message Bus...
Jan 26 14:15:44 np0005595786.novalocal dbus-broker-launch[14895]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 26 14:15:44 np0005595786.novalocal dbus-broker-launch[14895]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 26 14:15:44 np0005595786.novalocal systemd[4309]: Started D-Bus User Message Bus.
Jan 26 14:15:44 np0005595786.novalocal dbus-broker-lau[14895]: Ready
Jan 26 14:15:44 np0005595786.novalocal systemd[4309]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 14:15:44 np0005595786.novalocal systemd[4309]: Created slice Slice /user.
Jan 26 14:15:44 np0005595786.novalocal systemd[4309]: podman-14828.scope: unit configures an IP firewall, but not running as root.
Jan 26 14:15:44 np0005595786.novalocal systemd[4309]: (This warning is only shown for the first unit using IP firewalling.)
Jan 26 14:15:44 np0005595786.novalocal systemd[4309]: Started podman-14828.scope.
Jan 26 14:15:45 np0005595786.novalocal systemd[4309]: Started podman-pause-bb2b692c.scope.
Jan 26 14:15:45 np0005595786.novalocal sudo[15186]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjqinxhempuggrmoeurnfbnflrztgxwi ; /usr/bin/python3'
Jan 26 14:15:45 np0005595786.novalocal sudo[15186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:15:45 np0005595786.novalocal python3[15198]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.230:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.230:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:15:45 np0005595786.novalocal python3[15198]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 26 14:15:45 np0005595786.novalocal sudo[15186]: pam_unix(sudo:session): session closed for user root
Jan 26 14:15:46 np0005595786.novalocal sshd-session[8078]: Connection closed by 38.102.83.114 port 54626
Jan 26 14:15:46 np0005595786.novalocal sshd-session[8075]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:15:46 np0005595786.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Jan 26 14:15:46 np0005595786.novalocal systemd[1]: session-6.scope: Consumed 42.347s CPU time.
Jan 26 14:15:46 np0005595786.novalocal systemd-logind[795]: Session 6 logged out. Waiting for processes to exit.
Jan 26 14:15:46 np0005595786.novalocal systemd-logind[795]: Removed session 6.
Jan 26 14:15:54 np0005595786.novalocal irqbalance[790]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 26 14:15:54 np0005595786.novalocal irqbalance[790]: IRQ 27 affinity is now unmanaged
Jan 26 14:16:07 np0005595786.novalocal sshd-session[23149]: Connection closed by 38.102.83.94 port 36724 [preauth]
Jan 26 14:16:07 np0005595786.novalocal sshd-session[23154]: Connection closed by 38.102.83.94 port 36726 [preauth]
Jan 26 14:16:07 np0005595786.novalocal sshd-session[23155]: Unable to negotiate with 38.102.83.94 port 36738: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 26 14:16:07 np0005595786.novalocal sshd-session[23152]: Unable to negotiate with 38.102.83.94 port 36748: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 26 14:16:07 np0005595786.novalocal sshd-session[23156]: Unable to negotiate with 38.102.83.94 port 36730: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 26 14:16:11 np0005595786.novalocal sshd-session[24755]: Accepted publickey for zuul from 38.102.83.114 port 45374 ssh2: RSA SHA256:A4Dpo32LetI86PQcYPV26+sn2SDgPIozVnE9yyb/P6Q
Jan 26 14:16:11 np0005595786.novalocal systemd-logind[795]: New session 7 of user zuul.
Jan 26 14:16:11 np0005595786.novalocal systemd[1]: Started Session 7 of User zuul.
Jan 26 14:16:11 np0005595786.novalocal sshd-session[24755]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:16:12 np0005595786.novalocal python3[24854]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHoZtRQWBwoOjEsRPW4OJizwKraQ7AQFCZLo0UZN5mtCyj3Sh9waHKJHxRJ1vr+FUdBFHG050HUi/WrhzvDLRh4= zuul@np0005595784.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:16:12 np0005595786.novalocal sudo[25049]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csiiclmxuyjvgflpasnwvuquasncuxuh ; /usr/bin/python3'
Jan 26 14:16:12 np0005595786.novalocal sudo[25049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:16:12 np0005595786.novalocal python3[25058]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHoZtRQWBwoOjEsRPW4OJizwKraQ7AQFCZLo0UZN5mtCyj3Sh9waHKJHxRJ1vr+FUdBFHG050HUi/WrhzvDLRh4= zuul@np0005595784.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:16:12 np0005595786.novalocal sudo[25049]: pam_unix(sudo:session): session closed for user root
Jan 26 14:16:13 np0005595786.novalocal sudo[25414]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygzrmqrgrjsicpexutpmgurebsanftay ; /usr/bin/python3'
Jan 26 14:16:13 np0005595786.novalocal sudo[25414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:16:13 np0005595786.novalocal python3[25424]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005595786.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 26 14:16:13 np0005595786.novalocal useradd[25493]: new group: name=cloud-admin, GID=1002
Jan 26 14:16:13 np0005595786.novalocal useradd[25493]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 26 14:16:13 np0005595786.novalocal sudo[25414]: pam_unix(sudo:session): session closed for user root
Jan 26 14:16:13 np0005595786.novalocal sudo[25617]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sggmmqslfddwitmmpguuwrecbtmcnrba ; /usr/bin/python3'
Jan 26 14:16:13 np0005595786.novalocal sudo[25617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:16:13 np0005595786.novalocal python3[25629]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHoZtRQWBwoOjEsRPW4OJizwKraQ7AQFCZLo0UZN5mtCyj3Sh9waHKJHxRJ1vr+FUdBFHG050HUi/WrhzvDLRh4= zuul@np0005595784.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 14:16:14 np0005595786.novalocal sudo[25617]: pam_unix(sudo:session): session closed for user root
Jan 26 14:16:14 np0005595786.novalocal sudo[25913]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mscvuxljejvdqrpedbxcwzdnqlxogqfr ; /usr/bin/python3'
Jan 26 14:16:14 np0005595786.novalocal sudo[25913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:16:14 np0005595786.novalocal python3[25923]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:16:14 np0005595786.novalocal sudo[25913]: pam_unix(sudo:session): session closed for user root
Jan 26 14:16:14 np0005595786.novalocal sudo[26210]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mccybfdypqqowvgrxhvvhjgahvixdnie ; /usr/bin/python3'
Jan 26 14:16:14 np0005595786.novalocal sudo[26210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:16:14 np0005595786.novalocal python3[26219]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769436974.14999-152-72529257249592/source _original_basename=tmpnn8ib_4i follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:16:14 np0005595786.novalocal sudo[26210]: pam_unix(sudo:session): session closed for user root
Jan 26 14:16:15 np0005595786.novalocal sudo[26569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdujlzszziwhqklilslnqrcnuqhkhytq ; /usr/bin/python3'
Jan 26 14:16:15 np0005595786.novalocal sudo[26569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:16:15 np0005595786.novalocal python3[26578]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 26 14:16:15 np0005595786.novalocal systemd[1]: Starting Hostname Service...
Jan 26 14:16:15 np0005595786.novalocal systemd[1]: Started Hostname Service.
Jan 26 14:16:15 np0005595786.novalocal systemd-hostnamed[26685]: Changed pretty hostname to 'compute-1'
Jan 26 14:16:15 compute-1 systemd-hostnamed[26685]: Hostname set to <compute-1> (static)
Jan 26 14:16:15 compute-1 NetworkManager[7198]: <info>  [1769436975.9768] hostname: static hostname changed from "np0005595786.novalocal" to "compute-1"
Jan 26 14:16:15 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 14:16:16 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 14:16:16 compute-1 sudo[26569]: pam_unix(sudo:session): session closed for user root
Jan 26 14:16:16 compute-1 sshd-session[24799]: Connection closed by 38.102.83.114 port 45374
Jan 26 14:16:16 compute-1 sshd-session[24755]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:16:16 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Jan 26 14:16:16 compute-1 systemd[1]: session-7.scope: Consumed 2.386s CPU time.
Jan 26 14:16:16 compute-1 systemd-logind[795]: Session 7 logged out. Waiting for processes to exit.
Jan 26 14:16:16 compute-1 systemd-logind[795]: Removed session 7.
Jan 26 14:16:26 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 14:16:28 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 14:16:28 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 14:16:28 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1min 1.941s CPU time.
Jan 26 14:16:28 compute-1 systemd[1]: run-r53fa1a8f40744942b4cad7ebfc245c2e.service: Deactivated successfully.
Jan 26 14:16:46 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 14:20:18 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 26 14:20:18 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 26 14:20:18 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 26 14:20:18 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 26 14:20:51 compute-1 sshd-session[29923]: Accepted publickey for zuul from 38.102.83.94 port 46076 ssh2: RSA SHA256:A4Dpo32LetI86PQcYPV26+sn2SDgPIozVnE9yyb/P6Q
Jan 26 14:20:51 compute-1 systemd-logind[795]: New session 8 of user zuul.
Jan 26 14:20:51 compute-1 systemd[1]: Started Session 8 of User zuul.
Jan 26 14:20:51 compute-1 sshd-session[29923]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:20:51 compute-1 python3[29999]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:20:53 compute-1 sudo[30113]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysqjvmwpoouvaeyjxgcydnrwwuhtmulm ; /usr/bin/python3'
Jan 26 14:20:53 compute-1 sudo[30113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:53 compute-1 python3[30115]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:20:53 compute-1 sudo[30113]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:53 compute-1 sudo[30186]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpqmqkmvvwreewhrocmmbixpwicvrtqc ; /usr/bin/python3'
Jan 26 14:20:53 compute-1 sudo[30186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:54 compute-1 python3[30188]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769437253.3449848-33805-260231814285589/source mode=0755 _original_basename=delorean.repo follow=False checksum=2e65f5781089f6db35f20eae2311859479a007a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:20:54 compute-1 sudo[30186]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:54 compute-1 sudo[30212]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sonxnqjwwmxrtluwuzrjakndcfixpqky ; /usr/bin/python3'
Jan 26 14:20:54 compute-1 sudo[30212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:54 compute-1 python3[30214]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:20:54 compute-1 sudo[30212]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:54 compute-1 sudo[30285]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmjwplrkkrqdgxbwbwfpxluuamxxjeam ; /usr/bin/python3'
Jan 26 14:20:54 compute-1 sudo[30285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:54 compute-1 python3[30287]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769437253.3449848-33805-260231814285589/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=2c5ad31b3cd5c5b96a9995d83e342833f9bd7020 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:20:54 compute-1 sudo[30285]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:54 compute-1 sudo[30311]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujixrqzmcvvdilihgmpridsvnbdmameh ; /usr/bin/python3'
Jan 26 14:20:54 compute-1 sudo[30311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:55 compute-1 python3[30313]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:20:55 compute-1 sudo[30311]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:55 compute-1 sudo[30384]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buwawmrvqexeuskgrtlnyixvgwecutlg ; /usr/bin/python3'
Jan 26 14:20:55 compute-1 sudo[30384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:55 compute-1 python3[30386]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769437253.3449848-33805-260231814285589/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:20:55 compute-1 sudo[30384]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:55 compute-1 sudo[30410]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bximncxaoygurolnodmuyiehyjpdbrtq ; /usr/bin/python3'
Jan 26 14:20:55 compute-1 sudo[30410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:55 compute-1 python3[30412]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:20:55 compute-1 sudo[30410]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:56 compute-1 sudo[30483]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkanstzhlemfcbtmhjblnunjmmhabohw ; /usr/bin/python3'
Jan 26 14:20:56 compute-1 sudo[30483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:56 compute-1 python3[30485]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769437253.3449848-33805-260231814285589/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:20:56 compute-1 sudo[30483]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:56 compute-1 sudo[30509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlnbwuynoewmxcvawkeeotvhcgnpukoq ; /usr/bin/python3'
Jan 26 14:20:56 compute-1 sudo[30509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:56 compute-1 python3[30511]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:20:56 compute-1 sudo[30509]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:56 compute-1 sudo[30582]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xovugadjtsomskcgisgovawylaqfqtjx ; /usr/bin/python3'
Jan 26 14:20:56 compute-1 sudo[30582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:57 compute-1 python3[30584]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769437253.3449848-33805-260231814285589/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:20:57 compute-1 sudo[30582]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:57 compute-1 sudo[30608]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrbkjyurqvjwjqeowlnctcsnwzeqhlsv ; /usr/bin/python3'
Jan 26 14:20:57 compute-1 sudo[30608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:57 compute-1 python3[30610]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:20:57 compute-1 sudo[30608]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:57 compute-1 sudo[30681]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xentbbdjsxzycduxduuuhamildzdmdda ; /usr/bin/python3'
Jan 26 14:20:57 compute-1 sudo[30681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:57 compute-1 python3[30683]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769437253.3449848-33805-260231814285589/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:20:57 compute-1 sudo[30681]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:57 compute-1 sudo[30707]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlhofefrxyclkffpgmthhfhrwlaitdrp ; /usr/bin/python3'
Jan 26 14:20:57 compute-1 sudo[30707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:58 compute-1 python3[30709]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:20:58 compute-1 sudo[30707]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:58 compute-1 sudo[30780]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqsemxpdfpbekalwdlyaffriorxoqrqb ; /usr/bin/python3'
Jan 26 14:20:58 compute-1 sudo[30780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:58 compute-1 python3[30782]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769437253.3449848-33805-260231814285589/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=aa03f96b62b2a238943efcc5a547883c212e7d56 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:20:58 compute-1 sudo[30780]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:58 compute-1 sudo[30806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feokjtbdhimdsgdzgkcicfxqlakblvrq ; /usr/bin/python3'
Jan 26 14:20:58 compute-1 sudo[30806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:58 compute-1 python3[30808]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 14:20:58 compute-1 sudo[30806]: pam_unix(sudo:session): session closed for user root
Jan 26 14:20:59 compute-1 sudo[30879]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhlgaisiqwqlwuqzezkudgfnjfluzviw ; /usr/bin/python3'
Jan 26 14:20:59 compute-1 sudo[30879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:20:59 compute-1 python3[30881]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769437253.3449848-33805-260231814285589/source mode=0755 _original_basename=gating.repo follow=False checksum=8663a6ac2146a5b004c16369975d5c569c970566 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:20:59 compute-1 sudo[30879]: pam_unix(sudo:session): session closed for user root
Jan 26 14:21:52 compute-1 python3[30930]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:22:14 compute-1 sshd-session[30932]: banner exchange: Connection from 158.94.208.52 port 48056: invalid format
Jan 26 14:26:52 compute-1 sshd-session[29926]: Received disconnect from 38.102.83.94 port 46076:11: disconnected by user
Jan 26 14:26:52 compute-1 sshd-session[29926]: Disconnected from user zuul 38.102.83.94 port 46076
Jan 26 14:26:52 compute-1 sshd-session[29923]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:26:52 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Jan 26 14:26:52 compute-1 systemd[1]: session-8.scope: Consumed 6.874s CPU time.
Jan 26 14:26:52 compute-1 systemd-logind[795]: Session 8 logged out. Waiting for processes to exit.
Jan 26 14:26:52 compute-1 systemd-logind[795]: Removed session 8.
Jan 26 14:29:03 compute-1 sshd-session[30935]: Connection reset by authenticating user root 176.120.22.13 port 30504 [preauth]
Jan 26 14:29:05 compute-1 sshd-session[30937]: Invalid user admin from 176.120.22.13 port 60780
Jan 26 14:29:05 compute-1 sshd-session[30937]: Connection reset by invalid user admin 176.120.22.13 port 60780 [preauth]
Jan 26 14:29:08 compute-1 sshd-session[30939]: Connection reset by authenticating user root 176.120.22.13 port 60792 [preauth]
Jan 26 14:29:11 compute-1 sshd-session[30941]: Connection reset by authenticating user root 176.120.22.13 port 60804 [preauth]
Jan 26 14:29:13 compute-1 sshd-session[30943]: Invalid user  from 176.120.22.13 port 27450
Jan 26 14:29:13 compute-1 sshd-session[30943]: Connection reset by invalid user  176.120.22.13 port 27450 [preauth]
Jan 26 14:31:59 compute-1 sshd-session[30946]: Invalid user 0 from 185.246.128.170 port 20566
Jan 26 14:31:59 compute-1 sshd-session[30946]: Disconnecting invalid user 0 185.246.128.170 port 20566: Change of username or service not allowed: (0,ssh-connection) -> (aaa,ssh-connection) [preauth]
Jan 26 14:32:05 compute-1 sshd-session[30948]: Invalid user aaa from 185.246.128.170 port 5275
Jan 26 14:32:06 compute-1 sshd-session[30948]: Disconnecting invalid user aaa 185.246.128.170 port 5275: Change of username or service not allowed: (aaa,ssh-connection) -> (morteza,ssh-connection) [preauth]
Jan 26 14:32:19 compute-1 sshd-session[30950]: Invalid user morteza from 185.246.128.170 port 31519
Jan 26 14:32:21 compute-1 sshd-session[30950]: Disconnecting invalid user morteza 185.246.128.170 port 31519: Change of username or service not allowed: (morteza,ssh-connection) -> (support1,ssh-connection) [preauth]
Jan 26 14:32:34 compute-1 sshd-session[30952]: Invalid user support1 from 185.246.128.170 port 42625
Jan 26 14:32:36 compute-1 sshd-session[30952]: Disconnecting invalid user support1 185.246.128.170 port 42625: Change of username or service not allowed: (support1,ssh-connection) -> (wade,ssh-connection) [preauth]
Jan 26 14:32:39 compute-1 sshd-session[30954]: Invalid user wade from 185.246.128.170 port 40140
Jan 26 14:32:39 compute-1 sshd-session[30954]: Disconnecting invalid user wade 185.246.128.170 port 40140: Change of username or service not allowed: (wade,ssh-connection) -> (array,ssh-connection) [preauth]
Jan 26 14:32:52 compute-1 sshd-session[30957]: Invalid user array from 185.246.128.170 port 22603
Jan 26 14:32:53 compute-1 sshd-session[30957]: Disconnecting invalid user array 185.246.128.170 port 22603: Change of username or service not allowed: (array,ssh-connection) -> (kali,ssh-connection) [preauth]
Jan 26 14:33:05 compute-1 sshd-session[30959]: Invalid user kali from 185.246.128.170 port 51270
Jan 26 14:33:05 compute-1 sshd-session[30959]: Disconnecting invalid user kali 185.246.128.170 port 51270: Change of username or service not allowed: (kali,ssh-connection) -> (jrodrig,ssh-connection) [preauth]
Jan 26 14:33:14 compute-1 sshd-session[30961]: Invalid user jrodrig from 185.246.128.170 port 61462
Jan 26 14:33:15 compute-1 sshd-session[30961]: Disconnecting invalid user jrodrig 185.246.128.170 port 61462: Change of username or service not allowed: (jrodrig,ssh-connection) -> (dspace,ssh-connection) [preauth]
Jan 26 14:33:16 compute-1 sshd-session[30963]: Invalid user dspace from 185.246.128.170 port 14066
Jan 26 14:33:17 compute-1 sshd-session[30963]: Disconnecting invalid user dspace 185.246.128.170 port 14066: Change of username or service not allowed: (dspace,ssh-connection) -> (hacluster,ssh-connection) [preauth]
Jan 26 14:33:25 compute-1 sshd-session[30965]: Invalid user hacluster from 185.246.128.170 port 8074
Jan 26 14:33:27 compute-1 sshd-session[30965]: Disconnecting invalid user hacluster 185.246.128.170 port 8074: Change of username or service not allowed: (hacluster,ssh-connection) -> (dbadmin,ssh-connection) [preauth]
Jan 26 14:33:36 compute-1 sshd-session[30967]: Invalid user dbadmin from 185.246.128.170 port 52864
Jan 26 14:33:36 compute-1 sshd-session[30967]: Disconnecting invalid user dbadmin 185.246.128.170 port 52864: Change of username or service not allowed: (dbadmin,ssh-connection) -> (demo,ssh-connection) [preauth]
Jan 26 14:33:51 compute-1 sshd-session[30969]: Invalid user demo from 185.246.128.170 port 23616
Jan 26 14:33:52 compute-1 sshd-session[30969]: Disconnecting invalid user demo 185.246.128.170 port 23616: Change of username or service not allowed: (demo,ssh-connection) -> (ftpusr,ssh-connection) [preauth]
Jan 26 14:34:00 compute-1 sshd-session[30971]: Invalid user ftpusr from 185.246.128.170 port 44490
Jan 26 14:34:01 compute-1 sshd-session[30971]: Disconnecting invalid user ftpusr 185.246.128.170 port 44490: Change of username or service not allowed: (ftpusr,ssh-connection) -> (,ssh-connection) [preauth]
Jan 26 14:34:09 compute-1 sshd-session[30973]: Invalid user  from 185.246.128.170 port 16347
Jan 26 14:34:11 compute-1 sshd-session[30975]: Accepted publickey for zuul from 192.168.122.30 port 49096 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:34:11 compute-1 systemd-logind[795]: New session 9 of user zuul.
Jan 26 14:34:11 compute-1 systemd[1]: Started Session 9 of User zuul.
Jan 26 14:34:11 compute-1 sshd-session[30975]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:34:13 compute-1 python3.9[31128]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:34:13 compute-1 sshd-session[30973]: Disconnecting invalid user  185.246.128.170 port 16347: Change of username or service not allowed: (,ssh-connection) -> (splunk,ssh-connection) [preauth]
Jan 26 14:34:14 compute-1 sudo[31307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdwsmhuxfbcpyskirxrisfrlslvtgsnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438053.7548268-40-100457297764299/AnsiballZ_command.py'
Jan 26 14:34:14 compute-1 sudo[31307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:34:14 compute-1 python3.9[31309]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:34:21 compute-1 sshd-session[31324]: Invalid user splunk from 185.246.128.170 port 60678
Jan 26 14:34:23 compute-1 sshd-session[31324]: Disconnecting invalid user splunk 185.246.128.170 port 60678: Change of username or service not allowed: (splunk,ssh-connection) -> (operator,ssh-connection) [preauth]
Jan 26 14:34:26 compute-1 sudo[31307]: pam_unix(sudo:session): session closed for user root
Jan 26 14:34:26 compute-1 sshd-session[30978]: Connection closed by 192.168.122.30 port 49096
Jan 26 14:34:26 compute-1 sshd-session[30975]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:34:26 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Jan 26 14:34:26 compute-1 systemd[1]: session-9.scope: Consumed 8.320s CPU time.
Jan 26 14:34:26 compute-1 systemd-logind[795]: Session 9 logged out. Waiting for processes to exit.
Jan 26 14:34:26 compute-1 systemd-logind[795]: Removed session 9.
Jan 26 14:34:31 compute-1 sshd-session[31373]: Accepted publickey for zuul from 192.168.122.30 port 44262 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:34:31 compute-1 systemd-logind[795]: New session 10 of user zuul.
Jan 26 14:34:31 compute-1 systemd[1]: Started Session 10 of User zuul.
Jan 26 14:34:31 compute-1 sshd-session[31373]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:34:32 compute-1 sshd-session[31346]: Disconnecting authenticating user operator 185.246.128.170 port 48673: Change of username or service not allowed: (operator,ssh-connection) -> (es2,ssh-connection) [preauth]
Jan 26 14:34:32 compute-1 python3.9[31526]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:34:33 compute-1 sshd-session[31376]: Connection closed by 192.168.122.30 port 44262
Jan 26 14:34:33 compute-1 sshd-session[31373]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:34:33 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Jan 26 14:34:33 compute-1 systemd-logind[795]: Session 10 logged out. Waiting for processes to exit.
Jan 26 14:34:33 compute-1 systemd-logind[795]: Removed session 10.
Jan 26 14:34:51 compute-1 sshd-session[31556]: Accepted publickey for zuul from 192.168.122.30 port 51628 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:34:51 compute-1 systemd-logind[795]: New session 11 of user zuul.
Jan 26 14:34:51 compute-1 systemd[1]: Started Session 11 of User zuul.
Jan 26 14:34:51 compute-1 sshd-session[31556]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:34:51 compute-1 sshd-session[31554]: Invalid user es2 from 185.246.128.170 port 49893
Jan 26 14:34:52 compute-1 sshd-session[31554]: Disconnecting invalid user es2 185.246.128.170 port 49893: Change of username or service not allowed: (es2,ssh-connection) -> (nc,ssh-connection) [preauth]
Jan 26 14:34:52 compute-1 python3.9[31709]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 26 14:34:53 compute-1 python3.9[31883]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:34:54 compute-1 sudo[32034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktnihlnvzucjnimqseimfpfdibjsqwer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438093.7541726-65-102772049348699/AnsiballZ_command.py'
Jan 26 14:34:54 compute-1 sudo[32034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:34:54 compute-1 python3.9[32036]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:34:54 compute-1 sudo[32034]: pam_unix(sudo:session): session closed for user root
Jan 26 14:34:55 compute-1 sudo[32187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svrcqmvzsojziaumwucrujjdrzjlgxoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438094.7934906-89-12620846195210/AnsiballZ_stat.py'
Jan 26 14:34:55 compute-1 sudo[32187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:34:55 compute-1 python3.9[32189]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:34:55 compute-1 sudo[32187]: pam_unix(sudo:session): session closed for user root
Jan 26 14:34:56 compute-1 sudo[32339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfpenzqyxbhsydprchnwzcciprysiwdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438095.6947367-105-77455933059043/AnsiballZ_file.py'
Jan 26 14:34:56 compute-1 sudo[32339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:34:56 compute-1 python3.9[32341]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:34:56 compute-1 sudo[32339]: pam_unix(sudo:session): session closed for user root
Jan 26 14:34:56 compute-1 sudo[32491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdhntycprtptsfsglepjclkseesrsemd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438096.514045-121-32516788187589/AnsiballZ_stat.py'
Jan 26 14:34:56 compute-1 sudo[32491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:34:57 compute-1 python3.9[32493]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:34:57 compute-1 sudo[32491]: pam_unix(sudo:session): session closed for user root
Jan 26 14:34:57 compute-1 sudo[32614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzceunlaeidvlmoatljxstwggopdhcjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438096.514045-121-32516788187589/AnsiballZ_copy.py'
Jan 26 14:34:57 compute-1 sudo[32614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:34:57 compute-1 python3.9[32616]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438096.514045-121-32516788187589/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:34:57 compute-1 sudo[32614]: pam_unix(sudo:session): session closed for user root
Jan 26 14:34:58 compute-1 sudo[32766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krqvhkumqyovaionhenxklsafjzweabu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438098.0157266-151-199125113865772/AnsiballZ_setup.py'
Jan 26 14:34:58 compute-1 sudo[32766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:34:58 compute-1 python3.9[32768]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:34:58 compute-1 sudo[32766]: pam_unix(sudo:session): session closed for user root
Jan 26 14:34:59 compute-1 sudo[32923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqcbrrgowjqawuojisuqmsmspnrowode ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438099.008781-167-125956898965656/AnsiballZ_file.py'
Jan 26 14:34:59 compute-1 sudo[32923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:34:59 compute-1 python3.9[32925]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:34:59 compute-1 sudo[32923]: pam_unix(sudo:session): session closed for user root
Jan 26 14:35:00 compute-1 sudo[33075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niziixxrmezogapxmuyezjkngiynofqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438099.8123615-185-30986451064593/AnsiballZ_file.py'
Jan 26 14:35:00 compute-1 sudo[33075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:35:00 compute-1 python3.9[33077]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:35:00 compute-1 sudo[33075]: pam_unix(sudo:session): session closed for user root
Jan 26 14:35:01 compute-1 python3.9[33227]: ansible-ansible.builtin.service_facts Invoked
Jan 26 14:35:04 compute-1 python3.9[33480]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:35:05 compute-1 python3.9[33630]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:35:06 compute-1 sshd-session[31983]: Invalid user nc from 185.246.128.170 port 56900
Jan 26 14:35:06 compute-1 sshd-session[31983]: Disconnecting invalid user nc 185.246.128.170 port 56900: Change of username or service not allowed: (nc,ssh-connection) -> (dqi,ssh-connection) [preauth]
Jan 26 14:35:07 compute-1 python3.9[33784]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:35:07 compute-1 sudo[33940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwlmzxabcbxnlsobadhwjlagonzrurlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438107.5423725-281-40427391517868/AnsiballZ_setup.py'
Jan 26 14:35:07 compute-1 sudo[33940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:35:08 compute-1 python3.9[33942]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:35:08 compute-1 sudo[33940]: pam_unix(sudo:session): session closed for user root
Jan 26 14:35:08 compute-1 sudo[34024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbyqfdnqpzomjoirbypqqafcyiuktngl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438107.5423725-281-40427391517868/AnsiballZ_dnf.py'
Jan 26 14:35:08 compute-1 sudo[34024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:35:09 compute-1 python3.9[34026]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:35:15 compute-1 sshd-session[34058]: Invalid user dqi from 185.246.128.170 port 61346
Jan 26 14:35:15 compute-1 sshd-session[34058]: Disconnecting invalid user dqi 185.246.128.170 port 61346: Change of username or service not allowed: (dqi,ssh-connection) -> (ts,ssh-connection) [preauth]
Jan 26 14:35:23 compute-1 sshd-session[34098]: Invalid user ts from 185.246.128.170 port 5425
Jan 26 14:35:25 compute-1 sshd-session[34098]: Disconnecting invalid user ts 185.246.128.170 port 5425: Change of username or service not allowed: (ts,ssh-connection) -> (loginuser,ssh-connection) [preauth]
Jan 26 14:35:35 compute-1 sshd-session[34147]: Invalid user loginuser from 185.246.128.170 port 36419
Jan 26 14:35:37 compute-1 sshd-session[34147]: Disconnecting invalid user loginuser 185.246.128.170 port 36419: Change of username or service not allowed: (loginuser,ssh-connection) -> (lixiang,ssh-connection) [preauth]
Jan 26 14:35:45 compute-1 sshd-session[34178]: Invalid user lixiang from 185.246.128.170 port 7142
Jan 26 14:35:45 compute-1 sshd-session[34178]: Disconnecting invalid user lixiang 185.246.128.170 port 7142: Change of username or service not allowed: (lixiang,ssh-connection) -> (default,ssh-connection) [preauth]
Jan 26 14:35:48 compute-1 sshd-session[34180]: Invalid user default from 185.246.128.170 port 44611
Jan 26 14:35:55 compute-1 sshd-session[34180]: Disconnecting invalid user default 185.246.128.170 port 44611: Change of username or service not allowed: (default,ssh-connection) -> (manish,ssh-connection) [preauth]
Jan 26 14:35:56 compute-1 systemd[1]: Reloading.
Jan 26 14:35:56 compute-1 systemd-rc-local-generator[34231]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:35:56 compute-1 systemd[1]: Starting dnf makecache...
Jan 26 14:35:56 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 26 14:35:56 compute-1 dnf[34246]: Repository 'gating-repo' is missing name in configuration, using id.
Jan 26 14:35:57 compute-1 dnf[34246]: Failed determining last makecache time.
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-barbican-42b4c41831408a8e323 163 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 200 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-cinder-1c00d6490d88e436f26ef 156 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-python-stevedore-c4acc5639fd2329372142 159 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-python-cloudkitty-tests-tempest-2c80f8 198 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-os-refresh-config-9bfc52b5049be2d8de61 194 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 206 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-python-designate-tests-tempest-347fdbc 185 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-glance-1fd12c29b339f30fe823e 190 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 180 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-manila-3c01b7181572c95dac462 169 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-python-whitebox-neutron-tests-tempest- 178 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-octavia-ba397f07a7331190208c 186 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-watcher-c014f81a8647287f6dcc 175 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-ansible-config_template-5ccaa22121a7ff 186 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 167 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-swift-dc98a8463506ac520c469a 174 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-python-tempestconf-8515371b7cceebd4282 195 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: delorean-openstack-heat-ui-013accbfd179753bc3f0 196 kB/s | 3.0 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: gating-repo                                     254 kB/s | 1.5 kB     00:00
Jan 26 14:35:57 compute-1 dnf[34246]: CentOS Stream 9 - BaseOS                         69 kB/s | 6.7 kB     00:00
Jan 26 14:35:57 compute-1 systemd[1]: Reloading.
Jan 26 14:35:57 compute-1 systemd-rc-local-generator[34301]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:35:57 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 26 14:35:57 compute-1 dnf[34246]: CentOS Stream 9 - AppStream                      15 kB/s | 6.8 kB     00:00
Jan 26 14:35:58 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 26 14:35:58 compute-1 systemd[1]: Reloading.
Jan 26 14:35:58 compute-1 systemd-rc-local-generator[34342]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:35:58 compute-1 dnf[34246]: CentOS Stream 9 - CRB                            60 kB/s | 6.6 kB     00:00
Jan 26 14:35:58 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 26 14:35:58 compute-1 dnf[34246]: CentOS Stream 9 - Extras packages                69 kB/s | 7.3 kB     00:00
Jan 26 14:35:58 compute-1 dnf[34246]: dlrn-antelope-testing                            99 kB/s | 3.0 kB     00:00
Jan 26 14:35:58 compute-1 dnf[34246]: dlrn-antelope-build-deps                        103 kB/s | 3.0 kB     00:00
Jan 26 14:35:58 compute-1 dnf[34246]: centos9-rabbitmq                                113 kB/s | 3.0 kB     00:00
Jan 26 14:35:58 compute-1 dnf[34246]: centos9-storage                                 114 kB/s | 3.0 kB     00:00
Jan 26 14:35:58 compute-1 dnf[34246]: centos9-opstools                                101 kB/s | 3.0 kB     00:00
Jan 26 14:35:58 compute-1 dnf[34246]: NFV SIG OpenvSwitch                             104 kB/s | 3.0 kB     00:00
Jan 26 14:35:58 compute-1 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Jan 26 14:35:58 compute-1 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Jan 26 14:35:58 compute-1 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Jan 26 14:35:58 compute-1 dnf[34246]: repo-setup-centos-appstream                      17 kB/s | 4.4 kB     00:00
Jan 26 14:35:59 compute-1 dnf[34246]: repo-setup-centos-baseos                         12 kB/s | 3.9 kB     00:00
Jan 26 14:35:59 compute-1 dnf[34246]: repo-setup-centos-highavailability              134 kB/s | 3.9 kB     00:00
Jan 26 14:35:59 compute-1 dnf[34246]: repo-setup-centos-powertools                     77 kB/s | 4.3 kB     00:00
Jan 26 14:35:59 compute-1 dnf[34246]: Extra Packages for Enterprise Linux 9 - x86_64  166 kB/s |  32 kB     00:00
Jan 26 14:36:00 compute-1 dnf[34246]: Metadata cache created.
Jan 26 14:36:00 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 26 14:36:00 compute-1 systemd[1]: Finished dnf makecache.
Jan 26 14:36:00 compute-1 systemd[1]: dnf-makecache.service: Consumed 1.969s CPU time.
Jan 26 14:36:09 compute-1 sshd-session[34389]: Invalid user manish from 185.246.128.170 port 3796
Jan 26 14:36:12 compute-1 sshd-session[34389]: Disconnecting invalid user manish 185.246.128.170 port 3796: Change of username or service not allowed: (manish,ssh-connection) -> (Test,ssh-connection) [preauth]
Jan 26 14:36:19 compute-1 sshd-session[34419]: Invalid user Test from 185.246.128.170 port 13138
Jan 26 14:36:22 compute-1 sshd-session[34419]: Disconnecting invalid user Test 185.246.128.170 port 13138: Change of username or service not allowed: (Test,ssh-connection) -> (cristi,ssh-connection) [preauth]
Jan 26 14:36:29 compute-1 sshd-session[34469]: Invalid user cristi from 185.246.128.170 port 33740
Jan 26 14:36:33 compute-1 sshd-session[34469]: Disconnecting invalid user cristi 185.246.128.170 port 33740: Change of username or service not allowed: (cristi,ssh-connection) -> (openmediavault,ssh-connection) [preauth]
Jan 26 14:36:40 compute-1 sshd-session[34525]: Invalid user openmediavault from 185.246.128.170 port 26148
Jan 26 14:36:40 compute-1 sshd-session[34525]: Disconnecting invalid user openmediavault 185.246.128.170 port 26148: Change of username or service not allowed: (openmediavault,ssh-connection) -> (vtiger,ssh-connection) [preauth]
Jan 26 14:36:53 compute-1 sshd-session[34549]: Invalid user vtiger from 185.246.128.170 port 10657
Jan 26 14:36:56 compute-1 sshd-session[34549]: Disconnecting invalid user vtiger 185.246.128.170 port 10657: Change of username or service not allowed: (vtiger,ssh-connection) -> (intern,ssh-connection) [preauth]
Jan 26 14:37:08 compute-1 sshd-session[34563]: Invalid user intern from 185.246.128.170 port 18270
Jan 26 14:37:12 compute-1 sshd-session[34563]: Disconnecting invalid user intern 185.246.128.170 port 18270: Change of username or service not allowed: (intern,ssh-connection) -> (postgres,ssh-connection) [preauth]
Jan 26 14:37:12 compute-1 sshd[1009]: Timeout before authentication for connection from 220.195.3.197 to 38.102.83.217, pid = 34074
Jan 26 14:37:13 compute-1 kernel: SELinux:  Converting 2724 SID table entries...
Jan 26 14:37:13 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 14:37:13 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 14:37:13 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 14:37:13 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 14:37:13 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 14:37:13 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 14:37:13 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 14:37:13 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 26 14:37:14 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 14:37:14 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 14:37:14 compute-1 systemd[1]: Reloading.
Jan 26 14:37:14 compute-1 systemd-rc-local-generator[34714]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:37:14 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 14:37:15 compute-1 sudo[34024]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:15 compute-1 sudo[35625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzxkuvhevezdtzqmgbrupzabrwguxecm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438235.4579322-305-170156319328684/AnsiballZ_command.py'
Jan 26 14:37:15 compute-1 sudo[35625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:15 compute-1 python3.9[35627]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:37:16 compute-1 sudo[35625]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:17 compute-1 sudo[35906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzdolahbekrskbsdybvowxbhrhuixgpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438237.103848-321-16023015656878/AnsiballZ_selinux.py'
Jan 26 14:37:17 compute-1 sudo[35906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:18 compute-1 python3.9[35908]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 26 14:37:18 compute-1 sudo[35906]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:18 compute-1 sudo[36059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsdpfvpbwpqkujrogkcntdudvvtbuart ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438238.494804-343-275653733924840/AnsiballZ_command.py'
Jan 26 14:37:18 compute-1 sudo[36059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:18 compute-1 python3.9[36061]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 26 14:37:19 compute-1 sudo[36059]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:20 compute-1 sudo[36213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsdlfypbsxqfzyufmldyzvxdvewinvoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438240.3374383-359-145860549002317/AnsiballZ_file.py'
Jan 26 14:37:20 compute-1 sudo[36213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:23 compute-1 python3.9[36215]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:37:23 compute-1 sudo[36213]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:24 compute-1 sudo[36365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elzmabtjdolpuxibbyoexfaqxbellolt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438243.5525045-376-221613307497096/AnsiballZ_mount.py'
Jan 26 14:37:24 compute-1 sudo[36365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:24 compute-1 sshd-session[34748]: Invalid user postgres from 185.246.128.170 port 37415
Jan 26 14:37:24 compute-1 python3.9[36367]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 26 14:37:24 compute-1 sudo[36365]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:25 compute-1 sudo[36517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxdfilsmpbiyyvwtvgiahrdysstqvieb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438245.5352044-431-205216967986928/AnsiballZ_file.py'
Jan 26 14:37:25 compute-1 sudo[36517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:29 compute-1 sshd-session[34748]: Disconnecting invalid user postgres 185.246.128.170 port 37415: Change of username or service not allowed: (postgres,ssh-connection) -> (user3,ssh-connection) [preauth]
Jan 26 14:37:30 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 14:37:30 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 14:37:30 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.181s CPU time.
Jan 26 14:37:30 compute-1 systemd[1]: run-r6955afe3220b451c9d015d7ac62faa4b.service: Deactivated successfully.
Jan 26 14:37:30 compute-1 python3.9[36519]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:37:30 compute-1 sudo[36517]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:30 compute-1 sudo[36671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yofegkecdobahcnujcewfrmmlwgcimml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438250.685674-447-175342722602782/AnsiballZ_stat.py'
Jan 26 14:37:30 compute-1 sudo[36671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:31 compute-1 python3.9[36673]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:37:31 compute-1 sudo[36671]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:31 compute-1 sudo[36794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddizdjnovmhhgpmteixrhxmlbszbmjfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438250.685674-447-175342722602782/AnsiballZ_copy.py'
Jan 26 14:37:31 compute-1 sudo[36794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:31 compute-1 python3.9[36796]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438250.685674-447-175342722602782/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=539478296eac4798b42b7e16b49efacc0999af66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:37:31 compute-1 sudo[36794]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:35 compute-1 sudo[36946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkvocnpzhqdmuzvvhoupijqpxktbwham ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438255.1157165-495-262505606437779/AnsiballZ_stat.py'
Jan 26 14:37:35 compute-1 sudo[36946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:35 compute-1 python3.9[36948]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:37:35 compute-1 sudo[36946]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:36 compute-1 sudo[37098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gldbplkifnqbahfzrxngjiatqeiebaxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438255.977192-511-233516613608334/AnsiballZ_command.py'
Jan 26 14:37:36 compute-1 sudo[37098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:36 compute-1 python3.9[37100]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:37:36 compute-1 sudo[37098]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:37 compute-1 sudo[37251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nedmcsruxjrigctkcfkfapshyjznwchd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438256.8441536-527-65596780668993/AnsiballZ_file.py'
Jan 26 14:37:37 compute-1 sudo[37251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:37 compute-1 python3.9[37253]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:37:37 compute-1 sudo[37251]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:38 compute-1 sudo[37405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqkzxuoleoyjlxxkbqtnkpojoihrpzwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438257.830456-549-262702357877361/AnsiballZ_getent.py'
Jan 26 14:37:38 compute-1 sudo[37405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:38 compute-1 python3.9[37407]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 26 14:37:38 compute-1 sudo[37405]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:38 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:37:38 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:37:39 compute-1 sudo[37559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cicirzftiyhzfxpgmjwiuvuoqwxjbzaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438258.7429385-565-25564572448322/AnsiballZ_group.py'
Jan 26 14:37:39 compute-1 sudo[37559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:39 compute-1 python3.9[37561]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 14:37:39 compute-1 groupadd[37562]: group added to /etc/group: name=qemu, GID=107
Jan 26 14:37:39 compute-1 groupadd[37562]: group added to /etc/gshadow: name=qemu
Jan 26 14:37:39 compute-1 groupadd[37562]: new group: name=qemu, GID=107
Jan 26 14:37:39 compute-1 sudo[37559]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:40 compute-1 sudo[37717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjyoogovejtkwidqjkkzihumdlwqylwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438260.42832-581-230268838570873/AnsiballZ_user.py'
Jan 26 14:37:40 compute-1 sudo[37717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:41 compute-1 python3.9[37719]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 14:37:41 compute-1 useradd[37721]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 14:37:41 compute-1 sudo[37717]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:42 compute-1 sudo[37877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-refjxztzknapfpahdpzstguqrnwzspjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438261.7063894-597-45248382225950/AnsiballZ_getent.py'
Jan 26 14:37:42 compute-1 sudo[37877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:42 compute-1 python3.9[37879]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 26 14:37:42 compute-1 sudo[37877]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:42 compute-1 sshd-session[37254]: Invalid user user3 from 185.246.128.170 port 49633
Jan 26 14:37:42 compute-1 sudo[38030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndgdekimbvlzfmxyrjonfwwjoljcdgyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438262.5274053-613-98649854195561/AnsiballZ_group.py'
Jan 26 14:37:42 compute-1 sudo[38030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:43 compute-1 python3.9[38032]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 14:37:43 compute-1 groupadd[38033]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 26 14:37:43 compute-1 groupadd[38033]: group added to /etc/gshadow: name=hugetlbfs
Jan 26 14:37:43 compute-1 groupadd[38033]: new group: name=hugetlbfs, GID=42477
Jan 26 14:37:43 compute-1 sudo[38030]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:43 compute-1 sudo[38188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttjcwamnmuqbtwovzswwnlbhzfbbryuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438263.4428675-631-106171056507094/AnsiballZ_file.py'
Jan 26 14:37:43 compute-1 sudo[38188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:44 compute-1 python3.9[38190]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 26 14:37:44 compute-1 sudo[38188]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:44 compute-1 sudo[38340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opflstpfzachforfxeijfnhwtsdrpess ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438264.4190469-653-128659352288607/AnsiballZ_dnf.py'
Jan 26 14:37:44 compute-1 sudo[38340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:44 compute-1 python3.9[38342]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:37:46 compute-1 sudo[38340]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:47 compute-1 sudo[38493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djxusuavgzsqhwlenvppoqwwthnrxeza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438266.9920394-669-167540096011308/AnsiballZ_file.py'
Jan 26 14:37:47 compute-1 sudo[38493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:47 compute-1 python3.9[38495]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:37:47 compute-1 sudo[38493]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:48 compute-1 sudo[38645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cznahhstqlptziztxxschcglagygoyja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438267.7515273-685-262595116244024/AnsiballZ_stat.py'
Jan 26 14:37:48 compute-1 sudo[38645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:48 compute-1 python3.9[38647]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:37:48 compute-1 sudo[38645]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:48 compute-1 sudo[38768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxfdttyxpiegmlhzhkslxifjubpkfaau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438267.7515273-685-262595116244024/AnsiballZ_copy.py'
Jan 26 14:37:48 compute-1 sudo[38768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:48 compute-1 python3.9[38770]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438267.7515273-685-262595116244024/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:37:48 compute-1 sudo[38768]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:49 compute-1 sudo[38920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czxhyfeunyjsmtcjfjhavodbkahrylbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438269.096908-715-132591956150167/AnsiballZ_systemd.py'
Jan 26 14:37:49 compute-1 sudo[38920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:50 compute-1 python3.9[38922]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:37:50 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 26 14:37:50 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 26 14:37:50 compute-1 kernel: Bridge firewalling registered
Jan 26 14:37:50 compute-1 systemd-modules-load[38926]: Inserted module 'br_netfilter'
Jan 26 14:37:50 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 26 14:37:50 compute-1 sudo[38920]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:50 compute-1 sudo[39079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttyqwcwnqfeapdhatviircestnjgrmuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438270.4851046-731-277000885779037/AnsiballZ_stat.py'
Jan 26 14:37:50 compute-1 sudo[39079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:51 compute-1 python3.9[39081]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:37:51 compute-1 sudo[39079]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:51 compute-1 sudo[39202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqnyowmbraxipfghfmejhsqrxyjbsuoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438270.4851046-731-277000885779037/AnsiballZ_copy.py'
Jan 26 14:37:51 compute-1 sudo[39202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:51 compute-1 python3.9[39204]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438270.4851046-731-277000885779037/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:37:51 compute-1 sudo[39202]: pam_unix(sudo:session): session closed for user root
Jan 26 14:37:52 compute-1 sudo[39354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcunduufupceslkikaabbsjfrctoilct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438272.2067347-767-244238805500011/AnsiballZ_dnf.py'
Jan 26 14:37:52 compute-1 sudo[39354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:37:52 compute-1 python3.9[39356]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:37:52 compute-1 sshd-session[37254]: Disconnecting invalid user user3 185.246.128.170 port 49633: Change of username or service not allowed: (user3,ssh-connection) -> (test1,ssh-connection) [preauth]
Jan 26 14:37:56 compute-1 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Jan 26 14:37:56 compute-1 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Jan 26 14:37:57 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 14:37:57 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 14:37:57 compute-1 systemd[1]: Reloading.
Jan 26 14:37:57 compute-1 systemd-rc-local-generator[39416]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:37:57 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 14:38:00 compute-1 sudo[39354]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:01 compute-1 python3.9[42720]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:38:02 compute-1 python3.9[43224]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 26 14:38:03 compute-1 sshd-session[42596]: Invalid user test1 from 185.246.128.170 port 52807
Jan 26 14:38:03 compute-1 python3.9[43374]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:38:04 compute-1 sudo[43524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxuolekylhptqofuzhvumzjhvvolkahd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438284.0487432-845-179890764580862/AnsiballZ_command.py'
Jan 26 14:38:04 compute-1 sudo[43524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:04 compute-1 python3.9[43526]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:38:04 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 14:38:05 compute-1 systemd[1]: Starting Authorization Manager...
Jan 26 14:38:05 compute-1 polkitd[43743]: Started polkitd version 0.117
Jan 26 14:38:05 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 14:38:05 compute-1 polkitd[43743]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 14:38:05 compute-1 polkitd[43743]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 14:38:05 compute-1 polkitd[43743]: Finished loading, compiling and executing 2 rules
Jan 26 14:38:05 compute-1 systemd[1]: Started Authorization Manager.
Jan 26 14:38:05 compute-1 polkitd[43743]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 26 14:38:05 compute-1 sudo[43524]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:06 compute-1 sudo[43911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbnvjxpgepbdwhhfaccirmwmrzlbfddy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438285.8227928-863-249213073285414/AnsiballZ_systemd.py'
Jan 26 14:38:06 compute-1 sudo[43911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:06 compute-1 python3.9[43913]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:38:06 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 26 14:38:06 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Jan 26 14:38:06 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 26 14:38:06 compute-1 sshd-session[42596]: Disconnecting invalid user test1 185.246.128.170 port 52807: Change of username or service not allowed: (test1,ssh-connection) -> (rahul,ssh-connection) [preauth]
Jan 26 14:38:06 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 14:38:07 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 14:38:07 compute-1 sudo[43911]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:07 compute-1 python3.9[44074]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 26 14:38:09 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 14:38:09 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 14:38:09 compute-1 systemd[1]: man-db-cache-update.service: Consumed 6.086s CPU time.
Jan 26 14:38:09 compute-1 systemd[1]: run-r7beed97845b8474188a5139de22482d9.service: Deactivated successfully.
Jan 26 14:38:10 compute-1 sudo[44225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgvwocarepazpjidjakkgytlumuwbhug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438290.386705-977-97180371620887/AnsiballZ_systemd.py'
Jan 26 14:38:10 compute-1 sudo[44225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:11 compute-1 python3.9[44227]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:38:11 compute-1 systemd[1]: Reloading.
Jan 26 14:38:11 compute-1 systemd-rc-local-generator[44257]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:38:11 compute-1 sudo[44225]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:11 compute-1 sudo[44415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnpmwbweblnjyilkukvbkypvzxkpptqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438291.6415162-977-227124180457076/AnsiballZ_systemd.py'
Jan 26 14:38:11 compute-1 sudo[44415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:12 compute-1 python3.9[44417]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:38:12 compute-1 systemd[1]: Reloading.
Jan 26 14:38:12 compute-1 systemd-rc-local-generator[44449]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:38:12 compute-1 sudo[44415]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:13 compute-1 sudo[44605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adgfvgethsblljtktgqzgdywljwdotvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438293.0805519-1009-144636830465506/AnsiballZ_command.py'
Jan 26 14:38:13 compute-1 sudo[44605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:13 compute-1 python3.9[44607]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:38:13 compute-1 sudo[44605]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:14 compute-1 sudo[44758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxncxbnichbplodmtfoyqmypnqwfnkdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438293.9812508-1025-59609587773938/AnsiballZ_command.py'
Jan 26 14:38:14 compute-1 sudo[44758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:14 compute-1 python3.9[44760]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:38:14 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 26 14:38:14 compute-1 sudo[44758]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:15 compute-1 sudo[44911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuayrjmmnjayiblvwhmzratzfmkidqad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438294.9154234-1041-201517821494068/AnsiballZ_command.py'
Jan 26 14:38:15 compute-1 sudo[44911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:15 compute-1 python3.9[44913]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:38:16 compute-1 sshd-session[44265]: Invalid user rahul from 185.246.128.170 port 28089
Jan 26 14:38:16 compute-1 sudo[44911]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:17 compute-1 sshd-session[44265]: Disconnecting invalid user rahul 185.246.128.170 port 28089: Change of username or service not allowed: (rahul,ssh-connection) -> (btf,ssh-connection) [preauth]
Jan 26 14:38:17 compute-1 sudo[45073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gysxroalruececfwjxwdkjtuwybbzhvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438297.0932078-1057-216839564855190/AnsiballZ_command.py'
Jan 26 14:38:17 compute-1 sudo[45073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:17 compute-1 python3.9[45075]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:38:17 compute-1 sudo[45073]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:18 compute-1 sudo[45226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkflxjelihmdjtngfycsfiamrskdmyzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438298.0332537-1073-94461643904032/AnsiballZ_systemd.py'
Jan 26 14:38:18 compute-1 sudo[45226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:18 compute-1 python3.9[45228]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:38:18 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 14:38:18 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Jan 26 14:38:18 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Jan 26 14:38:18 compute-1 systemd[1]: Starting Apply Kernel Variables...
Jan 26 14:38:18 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 14:38:18 compute-1 systemd[1]: Finished Apply Kernel Variables.
Jan 26 14:38:18 compute-1 sudo[45226]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:19 compute-1 sshd-session[31559]: Connection closed by 192.168.122.30 port 51628
Jan 26 14:38:19 compute-1 sshd-session[31556]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:38:19 compute-1 systemd-logind[795]: Session 11 logged out. Waiting for processes to exit.
Jan 26 14:38:19 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Jan 26 14:38:19 compute-1 systemd[1]: session-11.scope: Consumed 2min 17.587s CPU time.
Jan 26 14:38:19 compute-1 systemd-logind[795]: Removed session 11.
Jan 26 14:38:19 compute-1 sshd-session[45229]: Invalid user btf from 185.246.128.170 port 45263
Jan 26 14:38:19 compute-1 sshd-session[45229]: Disconnecting invalid user btf 185.246.128.170 port 45263: Change of username or service not allowed: (btf,ssh-connection) -> (wss,ssh-connection) [preauth]
Jan 26 14:38:25 compute-1 sshd-session[45262]: Accepted publickey for zuul from 192.168.122.30 port 50120 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:38:25 compute-1 systemd-logind[795]: New session 12 of user zuul.
Jan 26 14:38:25 compute-1 systemd[1]: Started Session 12 of User zuul.
Jan 26 14:38:25 compute-1 sshd-session[45262]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:38:26 compute-1 python3.9[45415]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:38:28 compute-1 python3.9[45569]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:38:29 compute-1 sudo[45723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stmzynkgyzlocxeigygcqidwncpccfio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438308.7889345-76-57388696609923/AnsiballZ_command.py'
Jan 26 14:38:29 compute-1 sudo[45723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:29 compute-1 python3.9[45725]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:38:29 compute-1 sudo[45723]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:30 compute-1 sshd-session[45260]: Invalid user wss from 185.246.128.170 port 63910
Jan 26 14:38:30 compute-1 python3.9[45876]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:38:31 compute-1 sudo[46030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azeeqhesuhcsozefuektyzjzheungbrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438311.2320538-116-221105794786181/AnsiballZ_setup.py'
Jan 26 14:38:31 compute-1 sudo[46030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:31 compute-1 sshd-session[45260]: Disconnecting invalid user wss 185.246.128.170 port 63910: Change of username or service not allowed: (wss,ssh-connection) -> (storage,ssh-connection) [preauth]
Jan 26 14:38:31 compute-1 python3.9[46032]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:38:32 compute-1 sudo[46030]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:32 compute-1 sudo[46114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekdafnzlirgkjxrjrspnvzpwgcjutyzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438311.2320538-116-221105794786181/AnsiballZ_dnf.py'
Jan 26 14:38:32 compute-1 sudo[46114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:32 compute-1 python3.9[46116]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:38:34 compute-1 sudo[46114]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:34 compute-1 sudo[46267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyiltanuvpxkpmwpjlwzazdlahtbczxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438314.3144915-140-36735293848434/AnsiballZ_setup.py'
Jan 26 14:38:34 compute-1 sudo[46267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:34 compute-1 python3.9[46269]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:38:35 compute-1 sudo[46267]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:35 compute-1 sudo[46440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uizubuexixhqhjkdndtijxievubffzih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438315.3947074-162-124930447406043/AnsiballZ_file.py'
Jan 26 14:38:35 compute-1 sudo[46440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:36 compute-1 python3.9[46442]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:38:36 compute-1 sudo[46440]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:36 compute-1 sudo[46592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thlomrdsquzjbwqfgcjlimbxhcwnekoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438316.3117545-178-104767031583826/AnsiballZ_command.py'
Jan 26 14:38:36 compute-1 sudo[46592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:36 compute-1 python3.9[46594]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:38:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat2700351042-merged.mount: Deactivated successfully.
Jan 26 14:38:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck3929861836-merged.mount: Deactivated successfully.
Jan 26 14:38:36 compute-1 podman[46595]: 2026-01-26 14:38:36.889072042 +0000 UTC m=+0.054033109 system refresh
Jan 26 14:38:36 compute-1 sudo[46592]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:37 compute-1 sudo[46755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbhuwdqfbnwgiaeuhziuuelwjenwaoto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438317.1291327-194-17823170071889/AnsiballZ_stat.py'
Jan 26 14:38:37 compute-1 sudo[46755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:37 compute-1 python3.9[46757]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:38:37 compute-1 sudo[46755]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:37 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:38:38 compute-1 sudo[46878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaijnwsfcdyogyzfwftfobfkifgaxbrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438317.1291327-194-17823170071889/AnsiballZ_copy.py'
Jan 26 14:38:38 compute-1 sudo[46878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:38 compute-1 sshd-session[46313]: Invalid user storage from 185.246.128.170 port 17624
Jan 26 14:38:38 compute-1 sshd-session[46313]: Disconnecting invalid user storage 185.246.128.170 port 17624: Change of username or service not allowed: (storage,ssh-connection) -> (mysql,ssh-connection) [preauth]
Jan 26 14:38:38 compute-1 python3.9[46880]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438317.1291327-194-17823170071889/.source.json follow=False _original_basename=podman_network_config.j2 checksum=99b066161ce9a5f17ffceffa862004bc20297a4c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:38:38 compute-1 sudo[46878]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:39 compute-1 sudo[47032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byywebfyiwamtbiiawejpoufiuvxkilq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438318.8186839-224-217310001000693/AnsiballZ_stat.py'
Jan 26 14:38:39 compute-1 sudo[47032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:39 compute-1 python3.9[47034]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:38:39 compute-1 sudo[47032]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:40 compute-1 sudo[47155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gesdxfeufpfbtjnjijetwdyxryyeaqku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438318.8186839-224-217310001000693/AnsiballZ_copy.py'
Jan 26 14:38:40 compute-1 sudo[47155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:40 compute-1 python3.9[47157]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438318.8186839-224-217310001000693/.source.conf follow=False _original_basename=registries.conf.j2 checksum=bcc59bf45bad7d60f115042e5f035d9bc2349601 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:38:40 compute-1 sudo[47155]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:40 compute-1 sudo[47307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhjsqtuhjbkmkfihuspxwfiiiewcegcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438320.501797-256-249757500911386/AnsiballZ_ini_file.py'
Jan 26 14:38:40 compute-1 sudo[47307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:41 compute-1 python3.9[47309]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:38:41 compute-1 sudo[47307]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:41 compute-1 sudo[47459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbklamlayvghdqxgjnaonjytgqatrtat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438321.2780354-256-199666708612394/AnsiballZ_ini_file.py'
Jan 26 14:38:41 compute-1 sudo[47459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:41 compute-1 python3.9[47461]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:38:41 compute-1 sudo[47459]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:42 compute-1 sudo[47611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgigypkzfxnucalcmmqyvezpljjgnsao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438321.938412-256-162053351770505/AnsiballZ_ini_file.py'
Jan 26 14:38:42 compute-1 sudo[47611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:42 compute-1 python3.9[47613]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:38:42 compute-1 sudo[47611]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:42 compute-1 sudo[47763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqwdpbjskzycktfjgbadgczxmcrtlwmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438322.6669583-256-277372944264063/AnsiballZ_ini_file.py'
Jan 26 14:38:42 compute-1 sudo[47763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:43 compute-1 python3.9[47765]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:38:43 compute-1 sudo[47763]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:44 compute-1 sshd-session[46928]: Invalid user mysql from 185.246.128.170 port 41456
Jan 26 14:38:44 compute-1 python3.9[47915]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:38:45 compute-1 sudo[48067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdjkaovethhcefgwwnxhbmumsrdpykme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438324.8340535-336-183857787266052/AnsiballZ_dnf.py'
Jan 26 14:38:45 compute-1 sudo[48067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:45 compute-1 sshd-session[46928]: Disconnecting invalid user mysql 185.246.128.170 port 41456: Change of username or service not allowed: (mysql,ssh-connection) -> (guest,ssh-connection) [preauth]
Jan 26 14:38:45 compute-1 python3.9[48069]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:38:46 compute-1 sudo[48067]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:47 compute-1 sudo[48220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slfdmgglpkrcgxrnlzkjzwdstreuwdtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438326.839303-352-73543923704472/AnsiballZ_dnf.py'
Jan 26 14:38:47 compute-1 sudo[48220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:47 compute-1 python3.9[48222]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:38:50 compute-1 sudo[48220]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:50 compute-1 sudo[48382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlweprehrbacftfklvhjantampuettxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438330.5359306-372-174342357541366/AnsiballZ_dnf.py'
Jan 26 14:38:50 compute-1 sudo[48382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:51 compute-1 python3.9[48384]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:38:52 compute-1 sudo[48382]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:52 compute-1 sudo[48535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykyrvhgzdmhjxgpkzwbhjmlslljwicdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438332.7445123-390-280648702872838/AnsiballZ_dnf.py'
Jan 26 14:38:52 compute-1 sudo[48535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:53 compute-1 python3.9[48537]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:38:54 compute-1 sudo[48535]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:55 compute-1 sudo[48688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihzblpybwysggwmsjqaigkchhufqfdjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438335.4511864-412-120385002428637/AnsiballZ_dnf.py'
Jan 26 14:38:55 compute-1 sudo[48688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:55 compute-1 python3.9[48690]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:38:56 compute-1 sshd-session[48224]: Invalid user guest from 185.246.128.170 port 53456
Jan 26 14:38:57 compute-1 sudo[48688]: pam_unix(sudo:session): session closed for user root
Jan 26 14:38:58 compute-1 sudo[48844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvefriemqidwhwqtwzqdkvkznchozvhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438337.7632434-428-203880919032742/AnsiballZ_dnf.py'
Jan 26 14:38:58 compute-1 sudo[48844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:38:58 compute-1 python3.9[48846]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:38:58 compute-1 sshd-session[48224]: Disconnecting invalid user guest 185.246.128.170 port 53456: Change of username or service not allowed: (guest,ssh-connection) -> (oper,ssh-connection) [preauth]
Jan 26 14:39:01 compute-1 sshd-session[48849]: Invalid user oper from 185.246.128.170 port 1869
Jan 26 14:39:02 compute-1 sshd-session[48849]: Disconnecting invalid user oper 185.246.128.170 port 1869: Change of username or service not allowed: (oper,ssh-connection) -> (anonymous,ssh-connection) [preauth]
Jan 26 14:39:04 compute-1 sudo[48844]: pam_unix(sudo:session): session closed for user root
Jan 26 14:39:05 compute-1 sudo[49016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzxajpxpdokodpdukfgtbywfjpufefzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438344.7882123-446-130275653448211/AnsiballZ_dnf.py'
Jan 26 14:39:05 compute-1 sudo[49016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:39:05 compute-1 python3.9[49018]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:39:07 compute-1 sudo[49016]: pam_unix(sudo:session): session closed for user root
Jan 26 14:39:07 compute-1 sudo[49169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yubimjqsngxiebthznpzzqecodbhyisg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438347.3176286-464-85316058549829/AnsiballZ_dnf.py'
Jan 26 14:39:07 compute-1 sudo[49169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:39:07 compute-1 python3.9[49171]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:39:19 compute-1 sshd-session[49179]: Invalid user anonymous from 185.246.128.170 port 55909
Jan 26 14:39:25 compute-1 sshd-session[49179]: error: maximum authentication attempts exceeded for invalid user anonymous from 185.246.128.170 port 55909 ssh2 [preauth]
Jan 26 14:39:25 compute-1 sshd-session[49179]: Disconnecting invalid user anonymous 185.246.128.170 port 55909: Too many authentication failures [preauth]
Jan 26 14:39:38 compute-1 sshd-session[49185]: Invalid user anonymous from 185.246.128.170 port 47714
Jan 26 14:39:41 compute-1 sudo[49169]: pam_unix(sudo:session): session closed for user root
Jan 26 14:39:42 compute-1 sudo[49509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eogqhfhpswabamgeiolaanqucougeodj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438381.8295574-482-23807399637605/AnsiballZ_dnf.py'
Jan 26 14:39:42 compute-1 sudo[49509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:39:42 compute-1 python3.9[49511]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:39:43 compute-1 sudo[49509]: pam_unix(sudo:session): session closed for user root
Jan 26 14:39:44 compute-1 sudo[49665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxqmjvsmgnsanbfxcssycvouztafqihg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438384.087175-502-120698833872750/AnsiballZ_dnf.py'
Jan 26 14:39:44 compute-1 sudo[49665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:39:44 compute-1 python3.9[49667]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:39:46 compute-1 sshd-session[49185]: Disconnecting invalid user anonymous 185.246.128.170 port 47714: Change of username or service not allowed: (anonymous,ssh-connection) -> (timothy,ssh-connection) [preauth]
Jan 26 14:39:46 compute-1 sudo[49665]: pam_unix(sudo:session): session closed for user root
Jan 26 14:39:47 compute-1 sudo[49822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgxsvszdsyxsmwmuqizertupdgzcpuxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438386.9843178-524-50848783195518/AnsiballZ_file.py'
Jan 26 14:39:47 compute-1 sudo[49822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:39:47 compute-1 python3.9[49824]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:39:47 compute-1 sudo[49822]: pam_unix(sudo:session): session closed for user root
Jan 26 14:39:48 compute-1 sudo[49997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veazagqpvxmjvzbnyrnrorkgllmmwspi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438387.6676054-540-243608257722694/AnsiballZ_stat.py'
Jan 26 14:39:48 compute-1 sudo[49997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:39:48 compute-1 python3.9[49999]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:39:48 compute-1 sudo[49997]: pam_unix(sudo:session): session closed for user root
Jan 26 14:39:48 compute-1 sudo[50120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaarmshxidklanvmkinwuonabwvpston ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438387.6676054-540-243608257722694/AnsiballZ_copy.py'
Jan 26 14:39:48 compute-1 sudo[50120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:39:48 compute-1 python3.9[50122]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769438387.6676054-540-243608257722694/.source.json _original_basename=.6qxuaj2k follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:39:48 compute-1 sudo[50120]: pam_unix(sudo:session): session closed for user root
Jan 26 14:39:49 compute-1 sudo[50274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deavryxcradtpjvzfzxudurzgaomiysi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438389.2698715-576-61258561339563/AnsiballZ_podman_image.py'
Jan 26 14:39:49 compute-1 sudo[50274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:39:50 compute-1 python3.9[50276]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 14:39:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:39:52 compute-1 sshd-session[50123]: Invalid user timothy from 185.246.128.170 port 49700
Jan 26 14:39:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat1909164322-lower\x2dmapped.mount: Deactivated successfully.
Jan 26 14:39:56 compute-1 sshd-session[50123]: Disconnecting invalid user timothy 185.246.128.170 port 49700: Change of username or service not allowed: (timothy,ssh-connection) -> (joe,ssh-connection) [preauth]
Jan 26 14:40:01 compute-1 podman[50288]: 2026-01-26 14:40:01.701297696 +0000 UTC m=+11.628851466 image pull f8729094371621355e0152ff34e85f25e048ce5f2426134c9fea76fcb24d5c9d 38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Jan 26 14:40:01 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:01 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:01 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:01 compute-1 sudo[50274]: pam_unix(sudo:session): session closed for user root
Jan 26 14:40:02 compute-1 sshd-session[50385]: Invalid user joe from 185.246.128.170 port 23427
Jan 26 14:40:02 compute-1 sudo[50579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyfkfxbaeqksyinqeyqmumjudgaifusp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438402.388775-598-13540372434726/AnsiballZ_podman_image.py'
Jan 26 14:40:02 compute-1 sudo[50579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:40:02 compute-1 python3.9[50581]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 14:40:02 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:03 compute-1 sshd-session[50385]: Disconnecting invalid user joe 185.246.128.170 port 23427: Change of username or service not allowed: (joe,ssh-connection) -> (vncuser,ssh-connection) [preauth]
Jan 26 14:40:07 compute-1 sshd-session[50644]: Invalid user vncuser from 185.246.128.170 port 60478
Jan 26 14:40:07 compute-1 sshd-session[50644]: Disconnecting invalid user vncuser 185.246.128.170 port 60478: Change of username or service not allowed: (vncuser,ssh-connection) -> (pi,ssh-connection) [preauth]
Jan 26 14:40:10 compute-1 sshd-session[50646]: Invalid user pi from 185.246.128.170 port 24754
Jan 26 14:40:17 compute-1 sshd-session[50646]: Disconnecting invalid user pi 185.246.128.170 port 24754: Change of username or service not allowed: (pi,ssh-connection) -> (admin1,ssh-connection) [preauth]
Jan 26 14:40:22 compute-1 podman[50593]: 2026-01-26 14:40:22.998820034 +0000 UTC m=+20.082744940 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 14:40:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:23 compute-1 sudo[50579]: pam_unix(sudo:session): session closed for user root
Jan 26 14:40:23 compute-1 sudo[50894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhbbsrcaogaetktqsvpvuuqcyufnrbts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438423.6431634-618-217843235029746/AnsiballZ_podman_image.py'
Jan 26 14:40:23 compute-1 sudo[50894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:40:24 compute-1 python3.9[50896]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 14:40:24 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:27 compute-1 sshd-session[50695]: Invalid user admin1 from 185.246.128.170 port 56298
Jan 26 14:40:35 compute-1 sshd-session[50695]: error: maximum authentication attempts exceeded for invalid user admin1 from 185.246.128.170 port 56298 ssh2 [preauth]
Jan 26 14:40:35 compute-1 sshd-session[50695]: Disconnecting invalid user admin1 185.246.128.170 port 56298: Too many authentication failures [preauth]
Jan 26 14:40:36 compute-1 podman[50908]: 2026-01-26 14:40:36.780131565 +0000 UTC m=+12.547768403 image pull dcf510f4656465f698906cac740f99e5970bfc138793d2c5abda6beb4ca068f0 38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Jan 26 14:40:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:36 compute-1 sudo[50894]: pam_unix(sudo:session): session closed for user root
Jan 26 14:40:43 compute-1 sshd-session[51136]: Invalid user admin1 from 185.246.128.170 port 29870
Jan 26 14:40:45 compute-1 sudo[51263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrbvvbsmxetwkerurkvwvjtzcecwvdkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438445.117198-640-101150559124426/AnsiballZ_podman_image.py'
Jan 26 14:40:45 compute-1 sudo[51263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:40:46 compute-1 python3.9[51265]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.230:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 14:40:46 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:49 compute-1 sshd-session[51136]: Disconnecting invalid user admin1 185.246.128.170 port 29870: Change of username or service not allowed: (admin1,ssh-connection) -> (aovalle,ssh-connection) [preauth]
Jan 26 14:40:53 compute-1 podman[51277]: 2026-01-26 14:40:53.623711091 +0000 UTC m=+7.087477375 image pull fcf8a01ada948304d9fcffc30f8120a2584a61c07a88bea05d9b74b320860d4d 38.102.83.230:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Jan 26 14:40:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:40:53 compute-1 sudo[51263]: pam_unix(sudo:session): session closed for user root
Jan 26 14:40:54 compute-1 sudo[51534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snbbvppshghfnpjbexhzudyctofcqvea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438454.0510983-640-275330412340979/AnsiballZ_podman_image.py'
Jan 26 14:40:54 compute-1 sudo[51534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:40:54 compute-1 python3.9[51536]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 14:40:55 compute-1 sshd-session[51333]: Invalid user aovalle from 185.246.128.170 port 52682
Jan 26 14:40:58 compute-1 sshd-session[51333]: Disconnecting invalid user aovalle 185.246.128.170 port 52682: Change of username or service not allowed: (aovalle,ssh-connection) -> (minima,ssh-connection) [preauth]
Jan 26 14:41:02 compute-1 podman[51548]: 2026-01-26 14:41:02.814551367 +0000 UTC m=+8.190696322 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 26 14:41:02 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:41:02 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:41:02 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:41:03 compute-1 sudo[51534]: pam_unix(sudo:session): session closed for user root
Jan 26 14:41:03 compute-1 sshd-session[45265]: Connection closed by 192.168.122.30 port 50120
Jan 26 14:41:03 compute-1 sshd-session[45262]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:41:03 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Jan 26 14:41:03 compute-1 systemd[1]: session-12.scope: Consumed 1min 53.470s CPU time.
Jan 26 14:41:03 compute-1 systemd-logind[795]: Session 12 logged out. Waiting for processes to exit.
Jan 26 14:41:03 compute-1 systemd-logind[795]: Removed session 12.
Jan 26 14:41:08 compute-1 sshd-session[51700]: Accepted publickey for zuul from 192.168.122.30 port 41798 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:41:08 compute-1 systemd-logind[795]: New session 13 of user zuul.
Jan 26 14:41:08 compute-1 systemd[1]: Started Session 13 of User zuul.
Jan 26 14:41:08 compute-1 sshd-session[51700]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:41:08 compute-1 sshd-session[51698]: Invalid user minima from 185.246.128.170 port 49905
Jan 26 14:41:11 compute-1 python3.9[51853]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:41:12 compute-1 sudo[52007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzlsijlhzlwoexavdkprlqwwzjnpvgtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438471.8815665-48-222976404829232/AnsiballZ_getent.py'
Jan 26 14:41:12 compute-1 sudo[52007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:41:12 compute-1 python3.9[52009]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 26 14:41:12 compute-1 sudo[52007]: pam_unix(sudo:session): session closed for user root
Jan 26 14:41:13 compute-1 sudo[52160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urdymueukvjhsgpnmjywxygzoikyepxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438472.7088895-64-97322587718892/AnsiballZ_group.py'
Jan 26 14:41:13 compute-1 sudo[52160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:41:13 compute-1 python3.9[52162]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 14:41:13 compute-1 groupadd[52163]: group added to /etc/group: name=openvswitch, GID=42476
Jan 26 14:41:13 compute-1 groupadd[52163]: group added to /etc/gshadow: name=openvswitch
Jan 26 14:41:13 compute-1 groupadd[52163]: new group: name=openvswitch, GID=42476
Jan 26 14:41:13 compute-1 sudo[52160]: pam_unix(sudo:session): session closed for user root
Jan 26 14:41:15 compute-1 sudo[52318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxrnsfkytmcnlngunjgyhiqbcxsswtjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438475.0993319-80-146429565215773/AnsiballZ_user.py'
Jan 26 14:41:15 compute-1 sudo[52318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:41:15 compute-1 python3.9[52320]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 14:41:16 compute-1 useradd[52322]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 14:41:16 compute-1 sshd-session[51698]: Disconnecting invalid user minima 185.246.128.170 port 49905: Change of username or service not allowed: (minima,ssh-connection) -> (nsroot,ssh-connection) [preauth]
Jan 26 14:41:16 compute-1 useradd[52322]: add 'openvswitch' to group 'hugetlbfs'
Jan 26 14:41:16 compute-1 useradd[52322]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 26 14:41:18 compute-1 sudo[52318]: pam_unix(sudo:session): session closed for user root
Jan 26 14:41:18 compute-1 sudo[52480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxwxteuxelwgqbsdoigvbjuajnfbcwxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438478.5750992-100-276948750123618/AnsiballZ_setup.py'
Jan 26 14:41:18 compute-1 sudo[52480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:41:19 compute-1 python3.9[52482]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:41:19 compute-1 sudo[52480]: pam_unix(sudo:session): session closed for user root
Jan 26 14:41:19 compute-1 sudo[52564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irsfehsgfkhovssrbooiwzauvefxnazg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438478.5750992-100-276948750123618/AnsiballZ_dnf.py'
Jan 26 14:41:19 compute-1 sudo[52564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:41:20 compute-1 python3.9[52566]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:41:24 compute-1 sudo[52564]: pam_unix(sudo:session): session closed for user root
Jan 26 14:41:24 compute-1 sudo[52725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbwlxehjjsblezhuchpkdcjaxwaumied ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438484.5056071-128-228240058405543/AnsiballZ_dnf.py'
Jan 26 14:41:24 compute-1 sudo[52725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:41:25 compute-1 python3.9[52727]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:41:26 compute-1 sshd-session[52329]: Invalid user nsroot from 185.246.128.170 port 35524
Jan 26 14:41:27 compute-1 sshd-session[52329]: Disconnecting invalid user nsroot 185.246.128.170 port 35524: Change of username or service not allowed: (nsroot,ssh-connection) -> (dock,ssh-connection) [preauth]
Jan 26 14:41:32 compute-1 sshd-session[52742]: Invalid user dock from 185.246.128.170 port 9835
Jan 26 14:41:33 compute-1 sshd-session[52742]: Disconnecting invalid user dock 185.246.128.170 port 9835: Change of username or service not allowed: (dock,ssh-connection) -> (123456,ssh-connection) [preauth]
Jan 26 14:41:43 compute-1 kernel: SELinux:  Converting 2738 SID table entries...
Jan 26 14:41:43 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 14:41:43 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 14:41:43 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 14:41:43 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 14:41:43 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 14:41:43 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 14:41:43 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 14:41:43 compute-1 groupadd[52753]: group added to /etc/group: name=unbound, GID=994
Jan 26 14:41:43 compute-1 groupadd[52753]: group added to /etc/gshadow: name=unbound
Jan 26 14:41:43 compute-1 groupadd[52753]: new group: name=unbound, GID=994
Jan 26 14:41:43 compute-1 useradd[52760]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 26 14:41:44 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 26 14:41:44 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 26 14:41:46 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 14:41:46 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 14:41:46 compute-1 systemd[1]: Reloading.
Jan 26 14:41:46 compute-1 systemd-rc-local-generator[53254]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:41:46 compute-1 systemd-sysv-generator[53260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:41:46 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 14:41:49 compute-1 sshd-session[52748]: Invalid user 123456 from 185.246.128.170 port 61489
Jan 26 14:41:50 compute-1 sshd-session[52748]: Disconnecting invalid user 123456 185.246.128.170 port 61489: Change of username or service not allowed: (123456,ssh-connection) -> (tunnel,ssh-connection) [preauth]
Jan 26 14:41:50 compute-1 sudo[52725]: pam_unix(sudo:session): session closed for user root
Jan 26 14:41:50 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 14:41:50 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 14:41:50 compute-1 systemd[1]: run-rc88424ebb0384cb08e52fa41719b10ee.service: Deactivated successfully.
Jan 26 14:41:51 compute-1 sudo[53827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbyyhrotnmfhwcjipmjapzcndlfpiypl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438510.5037036-144-44638381460563/AnsiballZ_systemd.py'
Jan 26 14:41:51 compute-1 sudo[53827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:41:51 compute-1 python3.9[53829]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 14:41:51 compute-1 systemd[1]: Reloading.
Jan 26 14:41:51 compute-1 systemd-sysv-generator[53864]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:41:51 compute-1 systemd-rc-local-generator[53859]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:41:51 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Jan 26 14:41:51 compute-1 chown[53872]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 26 14:41:51 compute-1 ovs-ctl[53877]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 26 14:41:52 compute-1 ovs-ctl[53877]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 26 14:41:52 compute-1 ovs-ctl[53877]: Starting ovsdb-server [  OK  ]
Jan 26 14:41:52 compute-1 ovs-vsctl[53927]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 26 14:41:53 compute-1 ovs-vsctl[53943]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"41380b5a-e321-4ce4-bcc6-ecd563b3c793\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 26 14:41:53 compute-1 ovs-ctl[53877]: Configuring Open vSwitch system IDs [  OK  ]
Jan 26 14:41:53 compute-1 ovs-ctl[53877]: Enabling remote OVSDB managers [  OK  ]
Jan 26 14:41:53 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Jan 26 14:41:53 compute-1 ovs-vsctl[53953]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 26 14:41:53 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 26 14:41:53 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 26 14:41:53 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 26 14:41:53 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Jan 26 14:41:53 compute-1 ovs-ctl[53997]: Inserting openvswitch module [  OK  ]
Jan 26 14:41:53 compute-1 ovs-ctl[53966]: Starting ovs-vswitchd [  OK  ]
Jan 26 14:41:53 compute-1 ovs-ctl[53966]: Enabling remote OVSDB managers [  OK  ]
Jan 26 14:41:53 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 26 14:41:53 compute-1 ovs-vsctl[54016]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 26 14:41:53 compute-1 systemd[1]: Starting Open vSwitch...
Jan 26 14:41:53 compute-1 systemd[1]: Finished Open vSwitch.
Jan 26 14:41:53 compute-1 sudo[53827]: pam_unix(sudo:session): session closed for user root
Jan 26 14:41:54 compute-1 python3.9[54168]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:41:54 compute-1 sshd-session[53922]: Invalid user tunnel from 185.246.128.170 port 16024
Jan 26 14:41:54 compute-1 sudo[54318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlgxekxhpikcxkmefspisorefipmxyqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438514.4844265-180-168238612971855/AnsiballZ_sefcontext.py'
Jan 26 14:41:54 compute-1 sudo[54318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:41:55 compute-1 sshd-session[53922]: Disconnecting invalid user tunnel 185.246.128.170 port 16024: Change of username or service not allowed: (tunnel,ssh-connection) -> (kubelet,ssh-connection) [preauth]
Jan 26 14:41:55 compute-1 python3.9[54320]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 26 14:41:57 compute-1 kernel: SELinux:  Converting 2752 SID table entries...
Jan 26 14:41:57 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 14:41:57 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 14:41:57 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 14:41:57 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 14:41:57 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 14:41:57 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 14:41:57 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 14:41:57 compute-1 sudo[54318]: pam_unix(sudo:session): session closed for user root
Jan 26 14:41:58 compute-1 python3.9[54475]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:41:58 compute-1 sudo[54631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfqpxqjfpwbsxqcrbeqngbmwvownloxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438518.6765192-216-148598366234333/AnsiballZ_dnf.py'
Jan 26 14:41:58 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 26 14:41:58 compute-1 sudo[54631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:41:59 compute-1 python3.9[54633]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:42:00 compute-1 sudo[54631]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:01 compute-1 sudo[54786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqzjzliiegcengsjsgbnsmpicdjtirxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438520.9428122-232-188550844714552/AnsiballZ_command.py'
Jan 26 14:42:01 compute-1 sudo[54786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:01 compute-1 python3.9[54788]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:42:02 compute-1 sudo[54786]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:03 compute-1 sudo[55073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnibeyzoikgjruhdahargrsfnvwhqqhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438522.590315-248-39734029874044/AnsiballZ_file.py'
Jan 26 14:42:03 compute-1 sudo[55073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:03 compute-1 python3.9[55075]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 26 14:42:03 compute-1 sudo[55073]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:04 compute-1 python3.9[55225]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:42:04 compute-1 sudo[55377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnvlufiggkxlmcjyobyhcfmmuzzncbdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438524.4875412-280-280047760270217/AnsiballZ_dnf.py'
Jan 26 14:42:04 compute-1 sudo[55377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:04 compute-1 python3.9[55379]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:42:05 compute-1 sshd-session[54634]: Invalid user kubelet from 185.246.128.170 port 47750
Jan 26 14:42:05 compute-1 sshd-session[54634]: Disconnecting invalid user kubelet 185.246.128.170 port 47750: Change of username or service not allowed: (kubelet,ssh-connection) -> (liuj,ssh-connection) [preauth]
Jan 26 14:42:07 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 14:42:07 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 14:42:07 compute-1 systemd[1]: Reloading.
Jan 26 14:42:07 compute-1 systemd-rc-local-generator[55415]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:42:07 compute-1 systemd-sysv-generator[55418]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:42:07 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 14:42:08 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 14:42:08 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 14:42:08 compute-1 systemd[1]: run-r6b1a795b5f9d4d4f9041232a6aafd9dd.service: Deactivated successfully.
Jan 26 14:42:08 compute-1 sudo[55377]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:08 compute-1 sudo[55696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oitoicqomfxmjdcqlwfqsuwozvlilrql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438528.48772-296-122223202473640/AnsiballZ_systemd.py'
Jan 26 14:42:08 compute-1 sudo[55696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:09 compute-1 python3.9[55698]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:42:09 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 14:42:09 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Jan 26 14:42:09 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Jan 26 14:42:09 compute-1 systemd[1]: Stopping Network Manager...
Jan 26 14:42:09 compute-1 NetworkManager[7198]: <info>  [1769438529.1435] caught SIGTERM, shutting down normally.
Jan 26 14:42:09 compute-1 NetworkManager[7198]: <info>  [1769438529.1453] dhcp4 (eth0): canceled DHCP transaction
Jan 26 14:42:09 compute-1 NetworkManager[7198]: <info>  [1769438529.1453] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 14:42:09 compute-1 NetworkManager[7198]: <info>  [1769438529.1453] dhcp4 (eth0): state changed no lease
Jan 26 14:42:09 compute-1 NetworkManager[7198]: <info>  [1769438529.1458] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 14:42:09 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 14:42:09 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 14:42:09 compute-1 NetworkManager[7198]: <info>  [1769438529.2263] exiting (success)
Jan 26 14:42:09 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 14:42:09 compute-1 systemd[1]: Stopped Network Manager.
Jan 26 14:42:09 compute-1 systemd[1]: NetworkManager.service: Consumed 14.246s CPU time, 4.3M memory peak, read 0B from disk, written 39.0K to disk.
Jan 26 14:42:09 compute-1 systemd[1]: Starting Network Manager...
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.2789] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ac2670c8-0eba-49ef-ba2e-d02b046debf0)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.2789] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.2845] manager[0x55e50120d000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 14:42:09 compute-1 systemd[1]: Starting Hostname Service...
Jan 26 14:42:09 compute-1 systemd[1]: Started Hostname Service.
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3643] hostname: hostname: using hostnamed
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3644] hostname: static hostname changed from (none) to "compute-1"
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3650] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3655] manager[0x55e50120d000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3655] manager[0x55e50120d000]: rfkill: WWAN hardware radio set enabled
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3677] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3687] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3688] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3689] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3690] manager: Networking is enabled by state file
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3692] settings: Loaded settings plugin: keyfile (internal)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3696] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3724] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3733] dhcp: init: Using DHCP client 'internal'
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3735] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3741] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3746] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3754] device (lo): Activation: starting connection 'lo' (90cc952b-85d2-4ca7-a327-6d073fb6794e)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3761] device (eth0): carrier: link connected
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3765] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3771] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3771] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3777] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3784] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3789] device (eth1): carrier: link connected
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3793] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3798] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (1467aadd-b515-5a03-83b1-dc086af911e2) (indicated)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3800] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3807] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3819] device (eth1): Activation: starting connection 'ci-private-network' (1467aadd-b515-5a03-83b1-dc086af911e2)
Jan 26 14:42:09 compute-1 systemd[1]: Started Network Manager.
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3865] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3872] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3875] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3876] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3878] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3880] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3881] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3883] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3885] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3890] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3893] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3901] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3911] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3917] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3920] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3924] device (lo): Activation: successful, device activated.
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3929] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3931] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3934] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3935] device (eth1): Activation: successful, device activated.
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3941] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.3947] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 14:42:09 compute-1 sudo[55696]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:09 compute-1 systemd[1]: Starting Network Manager Wait Online...
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.5913] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.5941] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.5943] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.5947] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.5950] device (eth0): Activation: successful, device activated.
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.5957] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 14:42:09 compute-1 NetworkManager[55716]: <info>  [1769438529.5960] manager: startup complete
Jan 26 14:42:09 compute-1 systemd[1]: Finished Network Manager Wait Online.
Jan 26 14:42:09 compute-1 sudo[55922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxdxsxwbhpyklnkqkolhndlqvguluqyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438529.6531847-312-162336946026536/AnsiballZ_dnf.py'
Jan 26 14:42:09 compute-1 sudo[55922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:10 compute-1 python3.9[55924]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:42:13 compute-1 sshd-session[55384]: Invalid user liuj from 185.246.128.170 port 32588
Jan 26 14:42:18 compute-1 sshd-session[55384]: Disconnecting invalid user liuj 185.246.128.170 port 32588: Change of username or service not allowed: (liuj,ssh-connection) -> (ali,ssh-connection) [preauth]
Jan 26 14:42:19 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 14:42:20 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 14:42:20 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 14:42:20 compute-1 systemd[1]: Reloading.
Jan 26 14:42:20 compute-1 systemd-rc-local-generator[55980]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:42:20 compute-1 systemd-sysv-generator[55983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:42:20 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 14:42:23 compute-1 sshd-session[55943]: Invalid user ali from 185.246.128.170 port 36625
Jan 26 14:42:24 compute-1 sshd-session[55943]: Disconnecting invalid user ali 185.246.128.170 port 36625: Change of username or service not allowed: (ali,ssh-connection) -> (auditadm,ssh-connection) [preauth]
Jan 26 14:42:27 compute-1 sudo[55922]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:27 compute-1 sudo[56385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kslmkhmyptzknfvocqtaolwsklgqabfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438547.5325024-336-204880378902013/AnsiballZ_stat.py'
Jan 26 14:42:27 compute-1 sudo[56385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:28 compute-1 python3.9[56387]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:42:28 compute-1 sudo[56385]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:28 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 14:42:28 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 14:42:28 compute-1 systemd[1]: run-r58d59cc901dc4ce285ba087f86da5c67.service: Deactivated successfully.
Jan 26 14:42:28 compute-1 sudo[56538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvtksnrwyfqoebccifzxnficghubcuny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438548.2864642-354-191517681095749/AnsiballZ_ini_file.py'
Jan 26 14:42:28 compute-1 sudo[56538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:29 compute-1 python3.9[56540]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:29 compute-1 sudo[56538]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:29 compute-1 sudo[56692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzjpobrfrfudsssabchbbemqrktfswjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438549.4765396-374-242695048871355/AnsiballZ_ini_file.py'
Jan 26 14:42:29 compute-1 sudo[56692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:29 compute-1 python3.9[56694]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:29 compute-1 sudo[56692]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:30 compute-1 sudo[56844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnqevxderhyerurhsvqwynpnsjzjlptl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438550.1143782-374-263828189532680/AnsiballZ_ini_file.py'
Jan 26 14:42:30 compute-1 sudo[56844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:30 compute-1 python3.9[56846]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:30 compute-1 sudo[56844]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:31 compute-1 sudo[56996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnryjxwfmwjjiwthimnarmctikzlllbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438551.0505438-404-96298588222750/AnsiballZ_ini_file.py'
Jan 26 14:42:31 compute-1 sudo[56996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:31 compute-1 python3.9[56998]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:31 compute-1 sudo[56996]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:32 compute-1 sudo[57148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkzylfzmjdqowohwotrusuyetplsdqao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438551.7346222-404-28084085060453/AnsiballZ_ini_file.py'
Jan 26 14:42:32 compute-1 sudo[57148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:32 compute-1 python3.9[57150]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:32 compute-1 sudo[57148]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:32 compute-1 sshd-session[56258]: Invalid user auditadm from 185.246.128.170 port 37904
Jan 26 14:42:32 compute-1 sudo[57300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uugqxpbtfuyhsnbxeqpqyzmawpxflead ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438552.639634-434-23646343390152/AnsiballZ_stat.py'
Jan 26 14:42:32 compute-1 sudo[57300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:33 compute-1 python3.9[57302]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:42:33 compute-1 sudo[57300]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:33 compute-1 sudo[57423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpiyzmxzxkmlbjewqwjslqdsnxaptsvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438552.639634-434-23646343390152/AnsiballZ_copy.py'
Jan 26 14:42:33 compute-1 sudo[57423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:34 compute-1 python3.9[57425]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438552.639634-434-23646343390152/.source _original_basename=.ber62r_6 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:34 compute-1 sudo[57423]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:34 compute-1 sshd-session[56258]: Disconnecting invalid user auditadm 185.246.128.170 port 37904: Change of username or service not allowed: (auditadm,ssh-connection) -> (ubnt,ssh-connection) [preauth]
Jan 26 14:42:34 compute-1 sudo[57575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osevdxuutntjqvvbfpwvrubqutjwjkhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438554.3516555-464-5022558796600/AnsiballZ_file.py'
Jan 26 14:42:34 compute-1 sudo[57575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:34 compute-1 python3.9[57577]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:34 compute-1 sudo[57575]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:35 compute-1 sudo[57727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbvmrxvxmcfatsokzknjtdwzcetsbguo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438555.1302748-480-145696649056517/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 26 14:42:35 compute-1 sudo[57727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:35 compute-1 python3.9[57729]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 26 14:42:35 compute-1 sudo[57727]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:36 compute-1 sudo[57879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-timoojzbfvavvbffkkxtgoimpzgvntjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438556.1472418-498-64582637584207/AnsiballZ_file.py'
Jan 26 14:42:36 compute-1 sudo[57879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:36 compute-1 python3.9[57881]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:36 compute-1 sudo[57879]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:39 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 14:42:41 compute-1 sudo[58035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qthzejdtyfdxbqkkaymzdacqgvplpsgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438561.2780302-518-231982965227122/AnsiballZ_stat.py'
Jan 26 14:42:41 compute-1 sudo[58035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:41 compute-1 sudo[58035]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:42 compute-1 sudo[58158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzvgueunjhwaobadujtsyuvaiicbreyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438561.2780302-518-231982965227122/AnsiballZ_copy.py'
Jan 26 14:42:42 compute-1 sudo[58158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:42 compute-1 sudo[58158]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:42 compute-1 sudo[58310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsbwrgjwjjycjqvufkfhmkvqiwaxunzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438562.544895-548-108522049520204/AnsiballZ_slurp.py'
Jan 26 14:42:42 compute-1 sudo[58310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:43 compute-1 python3.9[58312]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 26 14:42:43 compute-1 sudo[58310]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:44 compute-1 sudo[58485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oergtjjhapbaleemwwypmeufqbqcgwnw ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438563.468244-566-147761481394453/async_wrapper.py j167425024556 300 /home/zuul/.ansible/tmp/ansible-tmp-1769438563.468244-566-147761481394453/AnsiballZ_edpm_os_net_config.py _'
Jan 26 14:42:44 compute-1 sudo[58485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:44 compute-1 ansible-async_wrapper.py[58487]: Invoked with j167425024556 300 /home/zuul/.ansible/tmp/ansible-tmp-1769438563.468244-566-147761481394453/AnsiballZ_edpm_os_net_config.py _
Jan 26 14:42:44 compute-1 ansible-async_wrapper.py[58490]: Starting module and watcher
Jan 26 14:42:44 compute-1 ansible-async_wrapper.py[58490]: Start watching 58491 (300)
Jan 26 14:42:44 compute-1 ansible-async_wrapper.py[58491]: Start module (58491)
Jan 26 14:42:44 compute-1 ansible-async_wrapper.py[58487]: Return async_wrapper task started.
Jan 26 14:42:44 compute-1 sudo[58485]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:44 compute-1 sshd-session[57882]: Invalid user ubnt from 185.246.128.170 port 45427
Jan 26 14:42:44 compute-1 python3.9[58492]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 26 14:42:45 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 26 14:42:45 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 26 14:42:45 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 26 14:42:45 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 26 14:42:45 compute-1 kernel: cfg80211: failed to load regulatory.db
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.3836] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.3852] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4381] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4382] audit: op="connection-add" uuid="f2a0ff3f-a8e5-4c69-ab74-3eb1a6eaa545" name="br-ex-br" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4396] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4397] audit: op="connection-add" uuid="e6f19b35-09b3-4a47-afbd-bf556d4c0e0c" name="br-ex-port" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4409] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4411] audit: op="connection-add" uuid="c1cf8db8-59d3-4b66-adbe-1b5207661cbe" name="eth1-port" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4421] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4421] audit: op="connection-add" uuid="a04292eb-faf7-4071-864b-26def163db60" name="vlan20-port" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4432] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4432] audit: op="connection-add" uuid="e73e4d64-e497-40c6-bf00-24abb2ea7b98" name="vlan21-port" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4442] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4451] audit: op="connection-add" uuid="2fe60693-f1bb-4df3-945f-48baa1893aac" name="vlan22-port" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4468] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout,connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4482] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.4483] audit: op="connection-add" uuid="5acec44e-f6bf-40e8-9b9b-5851d992de88" name="br-ex-if" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7682] audit: op="connection-update" uuid="1467aadd-b515-5a03-83b1-dc086af911e2" name="ci-private-network" args="ipv4.addresses,ipv4.dns,ipv4.routes,ipv4.method,ipv4.routing-rules,ipv4.never-default,ipv6.addresses,ipv6.dns,ipv6.addr-gen-mode,ipv6.method,ipv6.routing-rules,ipv6.routes,connection.timestamp,connection.master,connection.controller,connection.slave-type,connection.port-type,ovs-external-ids.data,ovs-interface.type" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7717] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7721] audit: op="connection-add" uuid="89f241e4-29ec-4073-b6c5-9da98e5e2abc" name="vlan20-if" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7747] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7751] audit: op="connection-add" uuid="297de040-2c1d-41de-9548-87afdce91cff" name="vlan21-if" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7777] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7780] audit: op="connection-add" uuid="f8f3f9e3-c602-465a-85f0-cb4a3c132532" name="vlan22-if" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7802] audit: op="connection-delete" uuid="6287cf3c-4986-3843-adcc-048b89f566c4" name="Wired connection 1" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7823] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.7828] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7837] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7847] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (f2a0ff3f-a8e5-4c69-ab74-3eb1a6eaa545)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7848] audit: op="connection-activate" uuid="f2a0ff3f-a8e5-4c69-ab74-3eb1a6eaa545" name="br-ex-br" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7852] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.7854] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7862] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7869] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (e6f19b35-09b3-4a47-afbd-bf556d4c0e0c)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7872] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.7874] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7882] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7889] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (c1cf8db8-59d3-4b66-adbe-1b5207661cbe)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7893] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.7897] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7910] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7921] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (a04292eb-faf7-4071-864b-26def163db60)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7927] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.7931] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7948] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7962] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (e73e4d64-e497-40c6-bf00-24abb2ea7b98)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7968] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.7972] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.7987] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8000] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (2fe60693-f1bb-4df3-945f-48baa1893aac)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8002] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8011] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8016] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8029] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.8032] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8040] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8050] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (5acec44e-f6bf-40e8-9b9b-5851d992de88)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8052] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8064] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8068] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8074] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8077] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8104] device (eth1): disconnecting for new activation request.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8106] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8110] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8112] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8114] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8118] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.8120] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8124] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8129] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (89f241e4-29ec-4073-b6c5-9da98e5e2abc)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8130] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8135] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8138] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8139] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8144] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.8145] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8149] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8156] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (297de040-2c1d-41de-9548-87afdce91cff)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8157] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8161] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8164] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8166] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8170] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <warn>  [1769438566.8171] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8176] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8181] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (f8f3f9e3-c602-465a-85f0-cb4a3c132532)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8182] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8186] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8188] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8189] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8190] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8207] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority,802-3-ethernet.mtu" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8210] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8214] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8216] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8225] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8231] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8237] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 kernel: ovs-system: entered promiscuous mode
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8240] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8243] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8249] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8253] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8257] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8259] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8266] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 systemd-udevd[58497]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 14:42:46 compute-1 kernel: Timeout policy base is empty
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8271] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8277] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8279] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8285] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8291] dhcp4 (eth0): canceled DHCP transaction
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8292] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8292] dhcp4 (eth0): state changed no lease
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8294] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8308] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8313] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58493 uid=0 result="fail" reason="Device is not activated"
Jan 26 14:42:46 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8369] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8432] device (eth1): disconnecting for new activation request.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8436] audit: op="connection-activate" uuid="1467aadd-b515-5a03-83b1-dc086af911e2" name="ci-private-network" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8442] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8452] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8457] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 26 14:42:46 compute-1 kernel: br-ex: entered promiscuous mode
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8600] device (eth1): Activation: starting connection 'ci-private-network' (1467aadd-b515-5a03-83b1-dc086af911e2)
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8606] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8617] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8619] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8628] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8629] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58493 uid=0 result="success"
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8630] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8631] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8632] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8633] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8634] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8637] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8644] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8647] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8652] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8656] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8659] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8662] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8666] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8670] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8675] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8679] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 14:42:46 compute-1 kernel: vlan22: entered promiscuous mode
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8683] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8687] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8693] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8696] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 systemd-udevd[58498]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8741] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8743] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8746] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8750] device (eth1): Activation: successful, device activated.
Jan 26 14:42:46 compute-1 kernel: vlan20: entered promiscuous mode
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8782] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 systemd-udevd[58588]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 14:42:46 compute-1 kernel: vlan21: entered promiscuous mode
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8824] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8842] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8849] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8854] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8859] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8867] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8868] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8872] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8909] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8927] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8941] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8948] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8955] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8960] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.8968] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.9038] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.9040] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 14:42:46 compute-1 NetworkManager[55716]: <info>  [1769438566.9045] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 14:42:47 compute-1 sudo[58824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wykmqbrtsyhiainixmoedphicivpgxbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438567.5709076-566-203872732356981/AnsiballZ_async_status.py'
Jan 26 14:42:47 compute-1 sudo[58824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:48 compute-1 NetworkManager[55716]: <info>  [1769438568.0200] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58493 uid=0 result="success"
Jan 26 14:42:48 compute-1 NetworkManager[55716]: <info>  [1769438568.2308] checkpoint[0x55e5011e3950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 26 14:42:48 compute-1 NetworkManager[55716]: <info>  [1769438568.2313] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58493 uid=0 result="success"
Jan 26 14:42:48 compute-1 python3.9[58826]: ansible-ansible.legacy.async_status Invoked with jid=j167425024556.58487 mode=status _async_dir=/root/.ansible_async
Jan 26 14:42:48 compute-1 sudo[58824]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:48 compute-1 NetworkManager[55716]: <info>  [1769438568.5008] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58493 uid=0 result="success"
Jan 26 14:42:48 compute-1 NetworkManager[55716]: <info>  [1769438568.5031] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58493 uid=0 result="success"
Jan 26 14:42:48 compute-1 NetworkManager[55716]: <info>  [1769438568.7860] audit: op="networking-control" arg="global-dns-configuration" pid=58493 uid=0 result="success"
Jan 26 14:42:48 compute-1 NetworkManager[55716]: <info>  [1769438568.7945] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 26 14:42:48 compute-1 NetworkManager[55716]: <info>  [1769438568.9655] audit: op="networking-control" arg="global-dns-configuration" pid=58493 uid=0 result="success"
Jan 26 14:42:48 compute-1 NetworkManager[55716]: <info>  [1769438568.9690] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58493 uid=0 result="success"
Jan 26 14:42:49 compute-1 NetworkManager[55716]: <info>  [1769438569.1337] checkpoint[0x55e5011e3a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 26 14:42:49 compute-1 NetworkManager[55716]: <info>  [1769438569.1346] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58493 uid=0 result="success"
Jan 26 14:42:49 compute-1 ansible-async_wrapper.py[58491]: Module complete (58491)
Jan 26 14:42:49 compute-1 ansible-async_wrapper.py[58490]: Done in kid B.
Jan 26 14:42:51 compute-1 sudo[58930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irajsczedkrljxaujnttfkvqrgztbyge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438567.5709076-566-203872732356981/AnsiballZ_async_status.py'
Jan 26 14:42:51 compute-1 sudo[58930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:51 compute-1 python3.9[58932]: ansible-ansible.legacy.async_status Invoked with jid=j167425024556.58487 mode=status _async_dir=/root/.ansible_async
Jan 26 14:42:51 compute-1 sudo[58930]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:51 compute-1 sudo[59030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgosntkkiwdacwsgzippuzjonaoychhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438567.5709076-566-203872732356981/AnsiballZ_async_status.py'
Jan 26 14:42:51 compute-1 sudo[59030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:52 compute-1 python3.9[59032]: ansible-ansible.legacy.async_status Invoked with jid=j167425024556.58487 mode=cleanup _async_dir=/root/.ansible_async
Jan 26 14:42:52 compute-1 sudo[59030]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:52 compute-1 sudo[59182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiydjmdfbwdctcyvvogmxpsvtijratws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438572.4827652-620-13528185809432/AnsiballZ_stat.py'
Jan 26 14:42:52 compute-1 sudo[59182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:52 compute-1 python3.9[59184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:42:52 compute-1 sudo[59182]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:53 compute-1 sudo[59305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxwznqrzdhqaejrtbdywyutmiyswwwyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438572.4827652-620-13528185809432/AnsiballZ_copy.py'
Jan 26 14:42:53 compute-1 sudo[59305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:53 compute-1 python3.9[59307]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438572.4827652-620-13528185809432/.source.returncode _original_basename=.6gatcxxj follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:53 compute-1 sudo[59305]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:53 compute-1 sshd-session[57882]: Disconnecting invalid user ubnt 185.246.128.170 port 45427: Change of username or service not allowed: (ubnt,ssh-connection) -> (git,ssh-connection) [preauth]
Jan 26 14:42:54 compute-1 sudo[59457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwxpxqucbdsqwklmdikgrgnbeewsfwrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438573.7965157-652-229151718044765/AnsiballZ_stat.py'
Jan 26 14:42:54 compute-1 sudo[59457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:54 compute-1 python3.9[59459]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:42:54 compute-1 sudo[59457]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:55 compute-1 sudo[59581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqfilybkxqdnszngpknsbissyntyhwys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438573.7965157-652-229151718044765/AnsiballZ_copy.py'
Jan 26 14:42:55 compute-1 sudo[59581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:55 compute-1 python3.9[59583]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438573.7965157-652-229151718044765/.source.cfg _original_basename=.icxr9aka follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:42:55 compute-1 sudo[59581]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:55 compute-1 sudo[59733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrsktkzozvjtuajfkmlebitvjwrnpowz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438575.66944-682-216511988471272/AnsiballZ_systemd.py'
Jan 26 14:42:55 compute-1 sudo[59733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:42:56 compute-1 python3.9[59735]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:42:56 compute-1 systemd[1]: Reloading Network Manager...
Jan 26 14:42:56 compute-1 NetworkManager[55716]: <info>  [1769438576.3766] audit: op="reload" arg="0" pid=59739 uid=0 result="success"
Jan 26 14:42:56 compute-1 NetworkManager[55716]: <info>  [1769438576.3774] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 26 14:42:56 compute-1 systemd[1]: Reloaded Network Manager.
Jan 26 14:42:56 compute-1 sudo[59733]: pam_unix(sudo:session): session closed for user root
Jan 26 14:42:57 compute-1 sshd-session[51703]: Connection closed by 192.168.122.30 port 41798
Jan 26 14:42:57 compute-1 sshd-session[51700]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:42:57 compute-1 systemd-logind[795]: Session 13 logged out. Waiting for processes to exit.
Jan 26 14:42:57 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Jan 26 14:42:57 compute-1 systemd[1]: session-13.scope: Consumed 53.943s CPU time.
Jan 26 14:42:57 compute-1 systemd-logind[795]: Removed session 13.
Jan 26 14:43:03 compute-1 sshd-session[59772]: Accepted publickey for zuul from 192.168.122.30 port 60276 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:43:03 compute-1 systemd-logind[795]: New session 14 of user zuul.
Jan 26 14:43:03 compute-1 systemd[1]: Started Session 14 of User zuul.
Jan 26 14:43:03 compute-1 sshd-session[59772]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:43:04 compute-1 python3.9[59925]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:43:04 compute-1 sshd-session[59768]: Invalid user git from 185.246.128.170 port 12805
Jan 26 14:43:05 compute-1 python3.9[60080]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:43:06 compute-1 python3.9[60269]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:43:06 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 14:43:07 compute-1 sshd-session[59775]: Connection closed by 192.168.122.30 port 60276
Jan 26 14:43:07 compute-1 sshd-session[59772]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:43:07 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Jan 26 14:43:07 compute-1 systemd[1]: session-14.scope: Consumed 2.370s CPU time.
Jan 26 14:43:07 compute-1 systemd-logind[795]: Session 14 logged out. Waiting for processes to exit.
Jan 26 14:43:07 compute-1 systemd-logind[795]: Removed session 14.
Jan 26 14:43:07 compute-1 sshd-session[59768]: Disconnecting invalid user git 185.246.128.170 port 12805: Change of username or service not allowed: (git,ssh-connection) -> (fa,ssh-connection) [preauth]
Jan 26 14:43:12 compute-1 sshd-session[60299]: Accepted publickey for zuul from 192.168.122.30 port 41626 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:43:12 compute-1 systemd-logind[795]: New session 15 of user zuul.
Jan 26 14:43:12 compute-1 systemd[1]: Started Session 15 of User zuul.
Jan 26 14:43:12 compute-1 sshd-session[60299]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:43:13 compute-1 python3.9[60452]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:43:14 compute-1 python3.9[60607]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:43:15 compute-1 sudo[60761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aogazecjymrogbslwwirhgwjcqqckdjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438595.433432-56-13491525608532/AnsiballZ_setup.py'
Jan 26 14:43:15 compute-1 sudo[60761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:15 compute-1 python3.9[60763]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:43:16 compute-1 sudo[60761]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:16 compute-1 sudo[60845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ustpcimhiywfnjxlupundiwsimowlcqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438595.433432-56-13491525608532/AnsiballZ_dnf.py'
Jan 26 14:43:16 compute-1 sudo[60845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:16 compute-1 python3.9[60847]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:43:18 compute-1 sudo[60845]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:18 compute-1 sudo[61000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfsxtfgbhigmcjsfwaeobtkfiosbvwrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438598.4674969-80-39409357905741/AnsiballZ_setup.py'
Jan 26 14:43:18 compute-1 sudo[61000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:19 compute-1 python3.9[61002]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:43:19 compute-1 sudo[61000]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:20 compute-1 sudo[61192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjdeqrgiixevyaiqcnrmlfddmejahcha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438599.714527-102-669079233619/AnsiballZ_file.py'
Jan 26 14:43:20 compute-1 sudo[61192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:20 compute-1 python3.9[61194]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:43:20 compute-1 sudo[61192]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:21 compute-1 sudo[61344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oufvzaydwwhynvvfslnavckukgsbiyzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438600.9228437-118-105606024648882/AnsiballZ_command.py'
Jan 26 14:43:21 compute-1 sudo[61344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:21 compute-1 python3.9[61346]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:43:21 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:43:21 compute-1 sudo[61344]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:22 compute-1 sudo[61508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txhcczymximqedzhjokftqvjgbfqbuvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438601.8099864-134-138334325105046/AnsiballZ_stat.py'
Jan 26 14:43:22 compute-1 sudo[61508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:22 compute-1 python3.9[61510]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:43:22 compute-1 sudo[61508]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:22 compute-1 sudo[61586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obrcgewhcqwnashrgfpvjelrorqcchjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438601.8099864-134-138334325105046/AnsiballZ_file.py'
Jan 26 14:43:22 compute-1 sudo[61586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:23 compute-1 python3.9[61588]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:43:23 compute-1 sudo[61586]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:23 compute-1 sudo[61738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urxtoqrkkxcxvkyhgmxbxmwhkciexnfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438603.23652-158-102496239420647/AnsiballZ_stat.py'
Jan 26 14:43:23 compute-1 sudo[61738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:23 compute-1 python3.9[61740]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:43:23 compute-1 sudo[61738]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:24 compute-1 sudo[61816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikvloohpedwqbbupzyirrtlfdkgtozdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438603.23652-158-102496239420647/AnsiballZ_file.py'
Jan 26 14:43:24 compute-1 sudo[61816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:24 compute-1 python3.9[61818]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:43:24 compute-1 sudo[61816]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:24 compute-1 sudo[61968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stvzsxxtiwenvvqexprwhhkitzgculmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438604.4778025-184-62117344570647/AnsiballZ_ini_file.py'
Jan 26 14:43:24 compute-1 sudo[61968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:25 compute-1 python3.9[61970]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:43:25 compute-1 sudo[61968]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:25 compute-1 sshd-session[60972]: Invalid user fa from 185.246.128.170 port 16307
Jan 26 14:43:25 compute-1 sudo[62120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbcrxtsrijkfyetpcovajekperyxmskh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438605.413531-184-255209884413325/AnsiballZ_ini_file.py'
Jan 26 14:43:25 compute-1 sudo[62120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:25 compute-1 python3.9[62122]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:43:25 compute-1 sudo[62120]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:26 compute-1 sudo[62272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-watozgryrlcrwsiemuyffrzbssszklzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438606.2312114-184-274991720973225/AnsiballZ_ini_file.py'
Jan 26 14:43:26 compute-1 sudo[62272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:26 compute-1 python3.9[62274]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:43:26 compute-1 sudo[62272]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:27 compute-1 sshd-session[60972]: Disconnecting invalid user fa 185.246.128.170 port 16307: Change of username or service not allowed: (fa,ssh-connection) -> (odoo17,ssh-connection) [preauth]
Jan 26 14:43:27 compute-1 sudo[62424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiecitgfauulhytwjfcfngumbirhigwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438607.000388-184-193571983429799/AnsiballZ_ini_file.py'
Jan 26 14:43:27 compute-1 sudo[62424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:27 compute-1 python3.9[62426]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:43:27 compute-1 sudo[62424]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:28 compute-1 sudo[62576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfnmlpahnwyylunouvjdvyxyijotkmeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438607.938044-246-72103972746351/AnsiballZ_dnf.py'
Jan 26 14:43:28 compute-1 sudo[62576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:28 compute-1 python3.9[62578]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:43:30 compute-1 sudo[62576]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:31 compute-1 sudo[62731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfvopncsqolxzhrwcpswkfrjusukdcla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438610.7944665-268-261193482190238/AnsiballZ_setup.py'
Jan 26 14:43:31 compute-1 sudo[62731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:31 compute-1 python3.9[62733]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:43:31 compute-1 sudo[62731]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:31 compute-1 sudo[62885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pheiyzdkkcqxqeunkjreefxlpdmqrktv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438611.7606497-284-189436757208540/AnsiballZ_stat.py'
Jan 26 14:43:31 compute-1 sudo[62885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:32 compute-1 python3.9[62887]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:43:32 compute-1 sudo[62885]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:32 compute-1 sudo[63037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czinzhajavhjrkmxilxsqbxnfhttyxuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438612.5691807-302-148472861685517/AnsiballZ_stat.py'
Jan 26 14:43:32 compute-1 sudo[63037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:33 compute-1 python3.9[63039]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:43:33 compute-1 sudo[63037]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:33 compute-1 sshd-session[62580]: Invalid user odoo17 from 185.246.128.170 port 62865
Jan 26 14:43:33 compute-1 sudo[63189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrogwyztzahdlqukkinfsbeocshuiltm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438613.5456457-322-107388590364094/AnsiballZ_command.py'
Jan 26 14:43:33 compute-1 sudo[63189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:34 compute-1 python3.9[63191]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:43:34 compute-1 sudo[63189]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:34 compute-1 sudo[63342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxwrktsmnpdywzcupewvhhrozivmfnbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438614.3866322-342-154465727298312/AnsiballZ_service_facts.py'
Jan 26 14:43:34 compute-1 sudo[63342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:34 compute-1 python3.9[63344]: ansible-service_facts Invoked
Jan 26 14:43:35 compute-1 network[63361]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 14:43:35 compute-1 network[63362]: 'network-scripts' will be removed from distribution in near future.
Jan 26 14:43:35 compute-1 network[63363]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 14:43:35 compute-1 sshd-session[62580]: Disconnecting invalid user odoo17 185.246.128.170 port 62865: Change of username or service not allowed: (odoo17,ssh-connection) -> (adsl,ssh-connection) [preauth]
Jan 26 14:43:37 compute-1 sudo[63342]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:39 compute-1 sudo[63648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjnxfapytfutmrtajieavikkkeqsmuqe ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769438618.7907794-372-101385506317860/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769438618.7907794-372-101385506317860/args'
Jan 26 14:43:39 compute-1 sudo[63648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:39 compute-1 sudo[63648]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:39 compute-1 sshd-session[63371]: Invalid user adsl from 185.246.128.170 port 53366
Jan 26 14:43:39 compute-1 sudo[63815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyelxpxyxcpyoagoloabwaavxjsipctb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438619.5795445-394-276465630581939/AnsiballZ_dnf.py'
Jan 26 14:43:39 compute-1 sudo[63815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:40 compute-1 python3.9[63817]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:43:40 compute-1 sshd-session[63371]: Disconnecting invalid user adsl 185.246.128.170 port 53366: Change of username or service not allowed: (adsl,ssh-connection) -> (abc,ssh-connection) [preauth]
Jan 26 14:43:41 compute-1 sudo[63815]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:42 compute-1 sudo[63970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxyychwacbogqauhqqgmvhntlxkyixye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438621.9311492-420-236621290103125/AnsiballZ_package_facts.py'
Jan 26 14:43:42 compute-1 sudo[63970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:42 compute-1 python3.9[63972]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 26 14:43:43 compute-1 sudo[63970]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:44 compute-1 sudo[64122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxprqdobldxlgoezpfzxjcvnunxpersg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438623.7674432-440-93936841938999/AnsiballZ_stat.py'
Jan 26 14:43:44 compute-1 sudo[64122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:44 compute-1 python3.9[64124]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:43:44 compute-1 sudo[64122]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:44 compute-1 sudo[64247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnkimciyyeqjufqzsfvdtvgpulzbetdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438623.7674432-440-93936841938999/AnsiballZ_copy.py'
Jan 26 14:43:44 compute-1 sudo[64247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:45 compute-1 python3.9[64249]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438623.7674432-440-93936841938999/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:43:45 compute-1 sudo[64247]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:45 compute-1 sshd-session[63843]: Invalid user abc from 185.246.128.170 port 30076
Jan 26 14:43:45 compute-1 sudo[64401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnqemoudgnrefpommzsjhazazkxwkosn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438625.2489395-470-277135086842248/AnsiballZ_stat.py'
Jan 26 14:43:45 compute-1 sudo[64401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:45 compute-1 python3.9[64403]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:43:45 compute-1 sudo[64401]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:46 compute-1 sshd-session[63843]: Disconnecting invalid user abc 185.246.128.170 port 30076: Change of username or service not allowed: (abc,ssh-connection) -> (nobody,ssh-connection) [preauth]
Jan 26 14:43:46 compute-1 sudo[64526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrcuafkskxerhmgdekfbkuehddkrprud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438625.2489395-470-277135086842248/AnsiballZ_copy.py'
Jan 26 14:43:46 compute-1 sudo[64526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:46 compute-1 python3.9[64528]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438625.2489395-470-277135086842248/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:43:46 compute-1 sudo[64526]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:47 compute-1 sudo[64680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydwmrreplwzjbbzcoiwtddigzpekjpxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438627.2076328-512-4462700506105/AnsiballZ_lineinfile.py'
Jan 26 14:43:47 compute-1 sudo[64680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:47 compute-1 python3.9[64682]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:43:47 compute-1 sudo[64680]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:48 compute-1 sudo[64835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcjahhpstwivpqnyehzsgqjcqfdyeucl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438628.6486237-543-214416440825965/AnsiballZ_setup.py'
Jan 26 14:43:48 compute-1 sudo[64835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:49 compute-1 python3.9[64837]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:43:49 compute-1 sudo[64835]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:49 compute-1 sudo[64920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izadqfytnruzlytdqlnoxcbkpihtpcrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438628.6486237-543-214416440825965/AnsiballZ_systemd.py'
Jan 26 14:43:49 compute-1 sudo[64920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:50 compute-1 python3.9[64922]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:43:50 compute-1 sudo[64920]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:51 compute-1 sudo[65074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mutjwptoadnlngldfshukknozpgeutit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438631.1192188-575-91376192187310/AnsiballZ_setup.py'
Jan 26 14:43:51 compute-1 sudo[65074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:51 compute-1 python3.9[65076]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:43:51 compute-1 sudo[65074]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:52 compute-1 sudo[65158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnqtxjnczbfykjmvyfkpachhnhhbkgvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438631.1192188-575-91376192187310/AnsiballZ_systemd.py'
Jan 26 14:43:52 compute-1 sudo[65158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:43:52 compute-1 python3.9[65160]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:43:52 compute-1 chronyd[792]: chronyd exiting
Jan 26 14:43:52 compute-1 systemd[1]: Stopping NTP client/server...
Jan 26 14:43:52 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Jan 26 14:43:52 compute-1 systemd[1]: Stopped NTP client/server.
Jan 26 14:43:52 compute-1 systemd[1]: Starting NTP client/server...
Jan 26 14:43:52 compute-1 chronyd[65168]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 14:43:52 compute-1 chronyd[65168]: Frequency -26.422 +/- 0.135 ppm read from /var/lib/chrony/drift
Jan 26 14:43:52 compute-1 chronyd[65168]: Loaded seccomp filter (level 2)
Jan 26 14:43:52 compute-1 systemd[1]: Started NTP client/server.
Jan 26 14:43:52 compute-1 sudo[65158]: pam_unix(sudo:session): session closed for user root
Jan 26 14:43:53 compute-1 sshd-session[60302]: Connection closed by 192.168.122.30 port 41626
Jan 26 14:43:53 compute-1 sshd-session[60299]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:43:53 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Jan 26 14:43:53 compute-1 systemd[1]: session-15.scope: Consumed 25.675s CPU time.
Jan 26 14:43:53 compute-1 systemd-logind[795]: Session 15 logged out. Waiting for processes to exit.
Jan 26 14:43:53 compute-1 systemd-logind[795]: Removed session 15.
Jan 26 14:43:58 compute-1 sshd-session[65194]: Accepted publickey for zuul from 192.168.122.30 port 51194 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:43:58 compute-1 systemd-logind[795]: New session 16 of user zuul.
Jan 26 14:43:58 compute-1 systemd[1]: Started Session 16 of User zuul.
Jan 26 14:43:58 compute-1 sshd-session[65194]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:43:59 compute-1 sshd-session[64709]: Disconnecting authenticating user nobody 185.246.128.170 port 20082: Change of username or service not allowed: (nobody,ssh-connection) -> (admin2,ssh-connection) [preauth]
Jan 26 14:43:59 compute-1 python3.9[65347]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:44:00 compute-1 sudo[65501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvkeztxrhmhgobbbmrzquoxdfqkvbszq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438640.0506954-42-179326793276958/AnsiballZ_file.py'
Jan 26 14:44:00 compute-1 sudo[65501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:00 compute-1 python3.9[65503]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:00 compute-1 sudo[65501]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:01 compute-1 sudo[65676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xizsascusgmgolqytofnwdxumzcqqgzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438641.0166054-58-23164415226864/AnsiballZ_stat.py'
Jan 26 14:44:01 compute-1 sudo[65676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:01 compute-1 python3.9[65678]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:01 compute-1 sudo[65676]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:01 compute-1 sudo[65754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzqyuirbsgrviecovvtxfnjrzymjfggl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438641.0166054-58-23164415226864/AnsiballZ_file.py'
Jan 26 14:44:01 compute-1 sudo[65754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:02 compute-1 python3.9[65756]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.fdszav66 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:02 compute-1 sudo[65754]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:02 compute-1 sudo[65906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icuxpphrzlgyunbuuobjzawfvjjdobyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438642.6512992-98-237103424856235/AnsiballZ_stat.py'
Jan 26 14:44:02 compute-1 sudo[65906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:03 compute-1 python3.9[65908]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:03 compute-1 sudo[65906]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:03 compute-1 sudo[66029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fysgtvhodzwydccmkymyvzzrepbmunzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438642.6512992-98-237103424856235/AnsiballZ_copy.py'
Jan 26 14:44:03 compute-1 sudo[66029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:03 compute-1 python3.9[66031]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438642.6512992-98-237103424856235/.source _original_basename=.z2debj8r follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:03 compute-1 sudo[66029]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:04 compute-1 irqbalance[790]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 26 14:44:04 compute-1 irqbalance[790]: IRQ 26 affinity is now unmanaged
Jan 26 14:44:04 compute-1 sudo[66181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djtwyvxeqgcvzoitzkkimystcwpmmgkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438644.083881-130-49291446800115/AnsiballZ_file.py'
Jan 26 14:44:04 compute-1 sudo[66181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:04 compute-1 python3.9[66183]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:44:04 compute-1 sudo[66181]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:05 compute-1 sudo[66333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbcfcjanekisltadtbzzzwmmzpaleetj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438644.784579-146-225626700655895/AnsiballZ_stat.py'
Jan 26 14:44:05 compute-1 sudo[66333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:05 compute-1 python3.9[66335]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:05 compute-1 sudo[66333]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:05 compute-1 sudo[66456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edzojmzskaywdivzdgntrsqgrtdfflbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438644.784579-146-225626700655895/AnsiballZ_copy.py'
Jan 26 14:44:05 compute-1 sudo[66456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:05 compute-1 python3.9[66458]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438644.784579-146-225626700655895/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:44:05 compute-1 sudo[66456]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:06 compute-1 sudo[66608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqjihfveanrwdwhkqiqdeozemlbcoejs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438646.035996-146-8164912580760/AnsiballZ_stat.py'
Jan 26 14:44:06 compute-1 sudo[66608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:06 compute-1 python3.9[66610]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:06 compute-1 sudo[66608]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:06 compute-1 sudo[66731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upiqcjyfbzcxuxlqhorctjhbehqrdljc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438646.035996-146-8164912580760/AnsiballZ_copy.py'
Jan 26 14:44:06 compute-1 sudo[66731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:07 compute-1 python3.9[66733]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438646.035996-146-8164912580760/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:44:07 compute-1 sudo[66731]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:07 compute-1 sudo[66883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onejvtbqwgdbllcxfwyeztswmbtjburm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438647.3096697-204-272461698469653/AnsiballZ_file.py'
Jan 26 14:44:07 compute-1 sudo[66883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:07 compute-1 python3.9[66885]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:07 compute-1 sudo[66883]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:08 compute-1 sudo[67035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxczxdsfbvbzmjiedjyrqleueojkhjtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438647.9946404-220-21693181823757/AnsiballZ_stat.py'
Jan 26 14:44:08 compute-1 sudo[67035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:08 compute-1 python3.9[67037]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:08 compute-1 sudo[67035]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:08 compute-1 sudo[67158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcqmlagovpqchhwoslkujukmzhnzlsjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438647.9946404-220-21693181823757/AnsiballZ_copy.py'
Jan 26 14:44:08 compute-1 sudo[67158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:09 compute-1 python3.9[67160]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438647.9946404-220-21693181823757/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:09 compute-1 sudo[67158]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:09 compute-1 sudo[67310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwfwcjoxopezdgwkrniqdyfespjkltts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438649.1817412-250-106165102774496/AnsiballZ_stat.py'
Jan 26 14:44:09 compute-1 sudo[67310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:09 compute-1 python3.9[67312]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:09 compute-1 sudo[67310]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:09 compute-1 sudo[67434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktawemomnpmuqrzwnfycwnbjgupksdoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438649.1817412-250-106165102774496/AnsiballZ_copy.py'
Jan 26 14:44:09 compute-1 sudo[67434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:10 compute-1 python3.9[67436]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438649.1817412-250-106165102774496/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:10 compute-1 sudo[67434]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:11 compute-1 sudo[67586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avjmlcrvgnrkjhqqwzhzfpndubcbcqer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438650.5958781-280-17314276665189/AnsiballZ_systemd.py'
Jan 26 14:44:11 compute-1 sudo[67586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:11 compute-1 python3.9[67588]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:44:11 compute-1 systemd[1]: Reloading.
Jan 26 14:44:11 compute-1 systemd-rc-local-generator[67612]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:44:11 compute-1 systemd-sysv-generator[67617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:44:12 compute-1 systemd[1]: Reloading.
Jan 26 14:44:12 compute-1 systemd-rc-local-generator[67650]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:44:12 compute-1 systemd-sysv-generator[67655]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:44:12 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Jan 26 14:44:12 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Jan 26 14:44:12 compute-1 sudo[67586]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:13 compute-1 sudo[67815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeyeofgvzyqrkbaxmwqdfhgkbwmhfbxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438652.7892435-296-239976659631874/AnsiballZ_stat.py'
Jan 26 14:44:13 compute-1 sudo[67815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:13 compute-1 python3.9[67817]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:13 compute-1 sudo[67815]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:13 compute-1 sudo[67938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjohofowhtdwtfslhvwzlhxdsvirfifl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438652.7892435-296-239976659631874/AnsiballZ_copy.py'
Jan 26 14:44:13 compute-1 sudo[67938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:13 compute-1 python3.9[67940]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438652.7892435-296-239976659631874/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:13 compute-1 sudo[67938]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:14 compute-1 sudo[68090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpqrzvxnbabtrqzoxevimwkardwlaqzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438653.9607832-326-6133358047818/AnsiballZ_stat.py'
Jan 26 14:44:14 compute-1 sudo[68090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:14 compute-1 python3.9[68092]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:14 compute-1 sudo[68090]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:14 compute-1 sudo[68213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjnhbhnrqpvslsjqczjxdglljgkamots ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438653.9607832-326-6133358047818/AnsiballZ_copy.py'
Jan 26 14:44:14 compute-1 sudo[68213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:14 compute-1 python3.9[68215]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438653.9607832-326-6133358047818/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:14 compute-1 sudo[68213]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:15 compute-1 sudo[68365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pduobmercktfsxwjccramjjyvxzlmcja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438655.220011-356-31197688733289/AnsiballZ_systemd.py'
Jan 26 14:44:15 compute-1 sudo[68365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:15 compute-1 python3.9[68367]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:44:15 compute-1 systemd[1]: Reloading.
Jan 26 14:44:15 compute-1 systemd-rc-local-generator[68394]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:44:15 compute-1 systemd-sysv-generator[68398]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:44:16 compute-1 systemd[1]: Reloading.
Jan 26 14:44:16 compute-1 systemd-sysv-generator[68434]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:44:16 compute-1 systemd-rc-local-generator[68430]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:44:16 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 14:44:16 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 14:44:16 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 14:44:16 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 14:44:16 compute-1 sudo[68365]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:17 compute-1 sshd-session[67336]: Invalid user admin2 from 185.246.128.170 port 40190
Jan 26 14:44:17 compute-1 python3.9[68592]: ansible-ansible.builtin.service_facts Invoked
Jan 26 14:44:17 compute-1 network[68609]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 14:44:17 compute-1 network[68610]: 'network-scripts' will be removed from distribution in near future.
Jan 26 14:44:17 compute-1 network[68611]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 14:44:18 compute-1 sshd-session[67336]: Disconnecting invalid user admin2 185.246.128.170 port 40190: Change of username or service not allowed: (admin2,ssh-connection) -> (sshd,ssh-connection) [preauth]
Jan 26 14:44:24 compute-1 sudo[68872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlswsjyuhyawmtpksezoziouhtytkdwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438664.6552775-388-109312376974266/AnsiballZ_systemd.py'
Jan 26 14:44:24 compute-1 sudo[68872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:25 compute-1 python3.9[68874]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:44:25 compute-1 systemd[1]: Reloading.
Jan 26 14:44:25 compute-1 systemd-rc-local-generator[68903]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:44:25 compute-1 systemd-sysv-generator[68907]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:44:25 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 26 14:44:25 compute-1 iptables.init[68914]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 26 14:44:25 compute-1 iptables.init[68914]: iptables: Flushing firewall rules: [  OK  ]
Jan 26 14:44:25 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Jan 26 14:44:25 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 26 14:44:25 compute-1 sudo[68872]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:26 compute-1 sudo[69108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmksoxrssvxzlqfmvxzvtvlrvycfxdfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438666.0185528-388-162325128482951/AnsiballZ_systemd.py'
Jan 26 14:44:26 compute-1 sudo[69108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:26 compute-1 python3.9[69110]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:44:26 compute-1 sudo[69108]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:27 compute-1 sudo[69263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvpstjowwkljyoqaozzrysfmyyvdjuwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438667.0388732-420-201608995742342/AnsiballZ_systemd.py'
Jan 26 14:44:27 compute-1 sudo[69263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:27 compute-1 python3.9[69265]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:44:27 compute-1 systemd[1]: Reloading.
Jan 26 14:44:27 compute-1 systemd-rc-local-generator[69294]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:44:27 compute-1 systemd-sysv-generator[69298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:44:29 compute-1 systemd[1]: Starting Netfilter Tables...
Jan 26 14:44:29 compute-1 systemd[1]: Finished Netfilter Tables.
Jan 26 14:44:29 compute-1 sudo[69263]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:29 compute-1 sudo[69454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnzaodrmdtiuysroppvlpjdasyclbxbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438669.6143181-436-209668839676527/AnsiballZ_command.py'
Jan 26 14:44:29 compute-1 sudo[69454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:30 compute-1 sshd-session[68746]: Disconnecting authenticating user sshd 185.246.128.170 port 10852: Change of username or service not allowed: (sshd,ssh-connection) -> (aman,ssh-connection) [preauth]
Jan 26 14:44:30 compute-1 python3.9[69456]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:44:30 compute-1 sudo[69454]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:31 compute-1 sudo[69607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alrbdawhqcdtlrlodgbhtpnnunjxjrxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438670.7816076-464-147355213965227/AnsiballZ_stat.py'
Jan 26 14:44:31 compute-1 sudo[69607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:31 compute-1 python3.9[69609]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:31 compute-1 sudo[69607]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:31 compute-1 sudo[69732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwxkswyozkyptobgrddoenpeljhpnfrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438670.7816076-464-147355213965227/AnsiballZ_copy.py'
Jan 26 14:44:31 compute-1 sudo[69732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:31 compute-1 python3.9[69734]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438670.7816076-464-147355213965227/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:31 compute-1 sudo[69732]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:32 compute-1 sudo[69885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmqpyoepshiijlmdcavpdgpqqkwtszll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438672.2115734-494-213870196938702/AnsiballZ_systemd.py'
Jan 26 14:44:32 compute-1 sudo[69885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:32 compute-1 python3.9[69887]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:44:32 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Jan 26 14:44:32 compute-1 sshd[1009]: Received SIGHUP; restarting.
Jan 26 14:44:32 compute-1 sshd[1009]: Server listening on 0.0.0.0 port 22.
Jan 26 14:44:32 compute-1 sshd[1009]: Server listening on :: port 22.
Jan 26 14:44:32 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Jan 26 14:44:32 compute-1 sudo[69885]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:33 compute-1 sudo[70042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xllbitnqcxjykxtljvkptudfcjboxcpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438673.1018062-510-13062490874871/AnsiballZ_file.py'
Jan 26 14:44:33 compute-1 sudo[70042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:33 compute-1 python3.9[70044]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:33 compute-1 sudo[70042]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:34 compute-1 sudo[70194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjwxzmfspovkmsucaapzoertztrehzvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438673.7744703-526-92949562848632/AnsiballZ_stat.py'
Jan 26 14:44:34 compute-1 sudo[70194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:34 compute-1 python3.9[70196]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:34 compute-1 sudo[70194]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:34 compute-1 sudo[70317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdkawtbdenmikgxxbgxeepcxdwnzaokt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438673.7744703-526-92949562848632/AnsiballZ_copy.py'
Jan 26 14:44:34 compute-1 sudo[70317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:34 compute-1 python3.9[70319]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438673.7744703-526-92949562848632/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:34 compute-1 sudo[70317]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:35 compute-1 sudo[70470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvyhvttqsmqneylzouzqnxeifomuvttf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438675.0761068-562-230951284925961/AnsiballZ_timezone.py'
Jan 26 14:44:35 compute-1 sudo[70470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:35 compute-1 python3.9[70472]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 14:44:35 compute-1 systemd[1]: Starting Time & Date Service...
Jan 26 14:44:35 compute-1 systemd[1]: Started Time & Date Service.
Jan 26 14:44:35 compute-1 sudo[70470]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:36 compute-1 sudo[70626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkndsrsnlodsnjkahavmgfbxrcbvlqng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438676.0609596-580-107754670927528/AnsiballZ_file.py'
Jan 26 14:44:36 compute-1 sudo[70626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:36 compute-1 python3.9[70628]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:36 compute-1 sudo[70626]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:37 compute-1 sudo[70778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfriuxqfggzgcwfcqsnkyfehgcxvtktd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438676.7756567-596-268637598407867/AnsiballZ_stat.py'
Jan 26 14:44:37 compute-1 sudo[70778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:37 compute-1 python3.9[70780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:37 compute-1 sudo[70778]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:37 compute-1 sudo[70901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juupuiofsvbpwoxkactnbmsreerjinhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438676.7756567-596-268637598407867/AnsiballZ_copy.py'
Jan 26 14:44:37 compute-1 sudo[70901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:37 compute-1 python3.9[70903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438676.7756567-596-268637598407867/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:37 compute-1 sudo[70901]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:38 compute-1 sudo[71053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjwkfnlpccgvcplbbkiloimankvbmrjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438677.988719-626-251765704614469/AnsiballZ_stat.py'
Jan 26 14:44:38 compute-1 sudo[71053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:38 compute-1 sshd-session[69916]: Invalid user aman from 185.246.128.170 port 12564
Jan 26 14:44:38 compute-1 python3.9[71055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:38 compute-1 sudo[71053]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:38 compute-1 sudo[71176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rniswydqnftwpxizgjldjturkoourghm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438677.988719-626-251765704614469/AnsiballZ_copy.py'
Jan 26 14:44:38 compute-1 sudo[71176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:39 compute-1 python3.9[71178]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438677.988719-626-251765704614469/.source.yaml _original_basename=.21ucux8h follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:39 compute-1 sudo[71176]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:39 compute-1 sudo[71328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkctsuslqztwcevmylrxsjnswvgxhlxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438679.2835107-656-104975093476740/AnsiballZ_stat.py'
Jan 26 14:44:39 compute-1 sudo[71328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:39 compute-1 python3.9[71330]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:39 compute-1 sudo[71328]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:40 compute-1 sudo[71451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rriqwyzyiedrqoxpmoqbtuvkpvgxccoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438679.2835107-656-104975093476740/AnsiballZ_copy.py'
Jan 26 14:44:40 compute-1 sudo[71451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:40 compute-1 python3.9[71453]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438679.2835107-656-104975093476740/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:40 compute-1 sudo[71451]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:40 compute-1 sudo[71603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huytxbuxdylixjsafttxkndijlqhuqcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438680.5246468-686-142026699338399/AnsiballZ_command.py'
Jan 26 14:44:40 compute-1 sudo[71603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:41 compute-1 python3.9[71605]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:44:41 compute-1 sudo[71603]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:41 compute-1 sudo[71756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmgoyuttowupkgfkcgdoqimdudqicguw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438681.2508256-702-93454135482197/AnsiballZ_command.py'
Jan 26 14:44:41 compute-1 sudo[71756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:41 compute-1 python3.9[71758]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:44:41 compute-1 sudo[71756]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:42 compute-1 sudo[71909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tctemdgzizugyexpljsgqwqqdjzqrqil ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769438682.013963-719-45834975997630/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 14:44:42 compute-1 sudo[71909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:42 compute-1 python3[71911]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 14:44:42 compute-1 sudo[71909]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:42 compute-1 sshd-session[69916]: Disconnecting invalid user aman 185.246.128.170 port 12564: Change of username or service not allowed: (aman,ssh-connection) -> (root,ssh-connection) [preauth]
Jan 26 14:44:43 compute-1 sudo[72061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubkhopqncbtwlbvwokbkoecceoynnwxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438682.9160123-735-235931845877641/AnsiballZ_stat.py'
Jan 26 14:44:43 compute-1 sudo[72061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:43 compute-1 python3.9[72063]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:43 compute-1 sudo[72061]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:43 compute-1 sudo[72184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paciccggxwlpmmmbqjwsqjcxerlrbpbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438682.9160123-735-235931845877641/AnsiballZ_copy.py'
Jan 26 14:44:43 compute-1 sudo[72184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:43 compute-1 python3.9[72186]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438682.9160123-735-235931845877641/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:43 compute-1 sudo[72184]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:44 compute-1 sudo[72336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecpsekumhbjmhapaitkhcshqxzgnedjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438684.4635942-764-108430296920649/AnsiballZ_stat.py'
Jan 26 14:44:44 compute-1 sudo[72336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:45 compute-1 python3.9[72338]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:45 compute-1 sudo[72336]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:45 compute-1 sudo[72459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwclskcapdfkwctfpshalyhegzhgasvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438684.4635942-764-108430296920649/AnsiballZ_copy.py'
Jan 26 14:44:45 compute-1 sudo[72459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:45 compute-1 python3.9[72461]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438684.4635942-764-108430296920649/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:45 compute-1 sudo[72459]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:46 compute-1 sudo[72612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdldianoxqivktltkkvbqsleobnsvnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438685.8382819-794-206812089602814/AnsiballZ_stat.py'
Jan 26 14:44:46 compute-1 sudo[72612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:46 compute-1 python3.9[72614]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:46 compute-1 sudo[72612]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:46 compute-1 sudo[72735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zceopukaalzblgvwudnjibwkgzhjkitr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438685.8382819-794-206812089602814/AnsiballZ_copy.py'
Jan 26 14:44:46 compute-1 sudo[72735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:46 compute-1 python3.9[72737]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438685.8382819-794-206812089602814/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:46 compute-1 sudo[72735]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:47 compute-1 sudo[72888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivtvbxewlvtqzeguoohmeqlnyfrzibuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438687.0096908-824-62835596031869/AnsiballZ_stat.py'
Jan 26 14:44:47 compute-1 sudo[72888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:47 compute-1 python3.9[72890]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:47 compute-1 sudo[72888]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:47 compute-1 sudo[73011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djkaxiyenjcevwmtzxtfbjdnbswatsvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438687.0096908-824-62835596031869/AnsiballZ_copy.py'
Jan 26 14:44:47 compute-1 sudo[73011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:48 compute-1 python3.9[73013]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438687.0096908-824-62835596031869/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:48 compute-1 sudo[73011]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:48 compute-1 sudo[73163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-robtpknwkocsvgttlupxeobhlgjribom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438688.2885244-854-21616978736748/AnsiballZ_stat.py'
Jan 26 14:44:48 compute-1 sudo[73163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:49 compute-1 python3.9[73165]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:44:49 compute-1 sudo[73163]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:49 compute-1 sudo[73286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upyihkiondqijccdodkblftfzibeebhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438688.2885244-854-21616978736748/AnsiballZ_copy.py'
Jan 26 14:44:49 compute-1 sudo[73286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:49 compute-1 python3.9[73288]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438688.2885244-854-21616978736748/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:49 compute-1 sudo[73286]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:50 compute-1 sudo[73438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvabujoyvmlavjtbvjqpqkiuhpazysmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438689.7879694-884-117262719575447/AnsiballZ_file.py'
Jan 26 14:44:50 compute-1 sudo[73438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:50 compute-1 python3.9[73440]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:50 compute-1 sudo[73438]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:50 compute-1 sudo[73590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwqrtvuquqcqnlbiwjajvkdeyvtirwai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438690.5731018-901-207202909586486/AnsiballZ_command.py'
Jan 26 14:44:50 compute-1 sudo[73590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:51 compute-1 python3.9[73592]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:44:51 compute-1 sudo[73590]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:51 compute-1 sudo[73749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdnsfmknpbvhktenpcmjdsazqnmzylme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438691.4147046-916-271549626374416/AnsiballZ_blockinfile.py'
Jan 26 14:44:51 compute-1 sudo[73749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:52 compute-1 python3.9[73751]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:52 compute-1 sudo[73749]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:52 compute-1 sudo[73902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-squtimlaccngtryuncwbyuboivlfogkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438692.3404696-934-99253185760286/AnsiballZ_file.py'
Jan 26 14:44:52 compute-1 sudo[73902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:52 compute-1 python3.9[73904]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:52 compute-1 sudo[73902]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:53 compute-1 sudo[74054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcddwrtmzvrewxddjjqzafshlfsuqean ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438693.1091785-934-178598373626481/AnsiballZ_file.py'
Jan 26 14:44:53 compute-1 sudo[74054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:53 compute-1 python3.9[74056]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:44:53 compute-1 sudo[74054]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:54 compute-1 sudo[74206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nahkzzvhrrhgqysdavljgtvxvdqafchz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438693.7689378-964-132768337061808/AnsiballZ_mount.py'
Jan 26 14:44:54 compute-1 sudo[74206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:54 compute-1 python3.9[74208]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 14:44:54 compute-1 sudo[74206]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:54 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:44:54 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:44:54 compute-1 sudo[74360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idekcdmacwhfqlnihlkdnjfusleyckra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438694.660258-964-33150037569040/AnsiballZ_mount.py'
Jan 26 14:44:54 compute-1 sudo[74360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:44:55 compute-1 python3.9[74362]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 14:44:55 compute-1 sudo[74360]: pam_unix(sudo:session): session closed for user root
Jan 26 14:44:55 compute-1 sshd-session[65197]: Connection closed by 192.168.122.30 port 51194
Jan 26 14:44:55 compute-1 sshd-session[65194]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:44:55 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Jan 26 14:44:55 compute-1 systemd[1]: session-16.scope: Consumed 35.813s CPU time.
Jan 26 14:44:55 compute-1 systemd-logind[795]: Session 16 logged out. Waiting for processes to exit.
Jan 26 14:44:55 compute-1 systemd-logind[795]: Removed session 16.
Jan 26 14:45:02 compute-1 sshd-session[74388]: Accepted publickey for zuul from 192.168.122.30 port 59198 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:45:02 compute-1 systemd-logind[795]: New session 17 of user zuul.
Jan 26 14:45:02 compute-1 systemd[1]: Started Session 17 of User zuul.
Jan 26 14:45:02 compute-1 sshd-session[74388]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:45:02 compute-1 sudo[74541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imcfkprkpvxqntmjedopkoqrqeueaiys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438702.4773767-18-223741088306202/AnsiballZ_tempfile.py'
Jan 26 14:45:02 compute-1 sudo[74541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:03 compute-1 python3.9[74543]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 26 14:45:03 compute-1 sudo[74541]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:03 compute-1 sudo[74693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhuxytxjxhpoxjyzpciwtqjvpopxphkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438703.317769-42-128690283832504/AnsiballZ_stat.py'
Jan 26 14:45:03 compute-1 sudo[74693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:04 compute-1 python3.9[74695]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:45:04 compute-1 sudo[74693]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:05 compute-1 sudo[74845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olrilyjehtcnjrvzxissrlildtrbrpye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438704.2495644-62-92975468778451/AnsiballZ_setup.py'
Jan 26 14:45:05 compute-1 sudo[74845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:05 compute-1 python3.9[74847]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:45:05 compute-1 sudo[74845]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:05 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 14:45:05 compute-1 sudo[75000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhpqgdfjbmmikfdypoagcxksdskcywst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438705.5612-79-131632108267253/AnsiballZ_blockinfile.py'
Jan 26 14:45:05 compute-1 sudo[75000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:06 compute-1 python3.9[75002]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0j2U6UG4lwLZ9/fJbKsQ5QsHQc3J3/uDmRrymfHTAL9j5bRz+gfNjEMAPInSzyqsaNpOmjIbj9TbHipI3W4IssATbrA31qo1ZNaHELvA6maOBDpqkHPJ7HuIP/OE8al0psg2tIqzGLq/H2t/K+x4DIWAdTExffm9ebGpNRHNfK6wCvJCXrVe67TrcHCge0MwZ+VLXlWNrzp1xIkvpoyRxQGalCprSncPdUyNBRAW+geCe1UnSGZLBkP727q2yH6uLMS+Fu2kW40onU61db2MKW8FW6jmqA8QBHAm4laAT0kUvW5Xl5w/nYzFsqSbqQ2tHnRj/MahVT/a6GieE022nfdy+KtE8OHtt4yvlpZ/J77j0XlLCc6/FRpqTBiGe83/W7/YKfcV/tI6iXu8om8GATZ72F8eQ4l1TUvpPHchdqLrt3Bss7FvW+S3CoekuJ5mU+IpwU89ysqvq4IlxBxWKuN3izIMU4RAwEVjwXq+Vx+DOmzybcjeEZjAHGla1DXM=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKZt6Yr0C2K1GjhwxZVpjygx93YSV+Kn2o3esvFwDMRm
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMCcM9Hgo8BLw2v9kVQRHHGnx2F+g3uprLpMEIB0ZbSd30J5BrC2jHJqTomvp4SA5QAfVkJ+ut1Y1d53HnGr7m4=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaTks/E7f5Lj8vRjYiETYezDsJeJPvGJ7iWQCLHOOdyd7hyB6t07iHeC/DifdbxAFQbQGv8UxPN82XwMztGI2uIxDo4mWjZiEAjZtcbQQZc+CXdx9S1ijeixzlc2O1FY9SimbTIqdCYCUHMLb1uN05MdHKuPx9Fr6L37DyKII0NO8u4OI7kcxciG/UYjop5PPiTvuIENkGV8rNaezwa+TvIznJm9tP7Hvehgv92XlrxO6I1uieoBYI29fH1qX8lVEA+vpn3Wmaw5tdLoPiEnmVaMqGWejJYIC9loVJBFttS/WJs+CSG+CQCdLnlNH7oSbYexXkB5PA8PlYz9Q4gFBUbixkRdzqn98F6S5CTAKRCxhPdLleii7FV2CSwOvI9V1ELpAo5jz2g2RfUCJo3SFBDIiTFVq85/e5hrHuZ21hTOu6YWySung/8nRE4F+ahhbBdHqw7BFrA06ISVpKLwQgcn2Vi5PH/4/SixVTnW3RlPBcjjWtuOEKSuT6PnCl/Fk=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFRR0ySWIg/GxeGtE8MjO0ju0tcWLKiNhfq4BdiEzwWG
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMAjK9bgTQm22J/TufQOPCoBUOWObkUWzI43Xua9Z8rsvSkIYFwzoFL869AzDAB3xMU6t7+K22B2mAZLsVZkFaI=
                                             create=True mode=0644 path=/tmp/ansible.aw9yrkga state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:06 compute-1 sudo[75000]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:06 compute-1 sudo[75152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdjaclxttftelshzymfmoabmpwjbqamn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438706.3460279-95-117911048574621/AnsiballZ_command.py'
Jan 26 14:45:06 compute-1 sudo[75152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:06 compute-1 python3.9[75154]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.aw9yrkga' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:45:06 compute-1 sudo[75152]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:07 compute-1 sshd-session[72579]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 40989 ssh2 [preauth]
Jan 26 14:45:07 compute-1 sshd-session[72579]: Disconnecting authenticating user root 185.246.128.170 port 40989: Too many authentication failures [preauth]
Jan 26 14:45:07 compute-1 sudo[75306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcyfgccrqjfyyeemqjkyywhrousjvlav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438707.1313078-111-115062095510256/AnsiballZ_file.py'
Jan 26 14:45:07 compute-1 sudo[75306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:07 compute-1 python3.9[75308]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.aw9yrkga state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:07 compute-1 sudo[75306]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:08 compute-1 sshd-session[74391]: Connection closed by 192.168.122.30 port 59198
Jan 26 14:45:08 compute-1 sshd-session[74388]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:45:08 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Jan 26 14:45:08 compute-1 systemd[1]: session-17.scope: Consumed 3.454s CPU time.
Jan 26 14:45:08 compute-1 systemd-logind[795]: Session 17 logged out. Waiting for processes to exit.
Jan 26 14:45:08 compute-1 systemd-logind[795]: Removed session 17.
Jan 26 14:45:13 compute-1 sshd-session[75333]: Accepted publickey for zuul from 192.168.122.30 port 38624 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:45:13 compute-1 systemd-logind[795]: New session 18 of user zuul.
Jan 26 14:45:13 compute-1 systemd[1]: Started Session 18 of User zuul.
Jan 26 14:45:13 compute-1 sshd-session[75333]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:45:14 compute-1 python3.9[75486]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:45:15 compute-1 sudo[75641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmojfyuwtazsufgsubccxlheejtkalgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438714.6338735-40-223734849877812/AnsiballZ_systemd.py'
Jan 26 14:45:15 compute-1 sudo[75641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:15 compute-1 python3.9[75643]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 14:45:15 compute-1 sudo[75641]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:16 compute-1 sudo[75796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpqndgdotirirismydzxkfostdvtrqgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438715.9558084-56-65156841683143/AnsiballZ_systemd.py'
Jan 26 14:45:16 compute-1 sudo[75796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:16 compute-1 python3.9[75798]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:45:16 compute-1 sudo[75796]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:17 compute-1 sudo[75949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysmtwbwdgdnhvtavbukxaybmugjymlzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438716.9120302-74-278854738018202/AnsiballZ_command.py'
Jan 26 14:45:17 compute-1 sudo[75949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:17 compute-1 python3.9[75951]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:45:17 compute-1 sudo[75949]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:18 compute-1 sudo[76102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfzzbtchlvagojljvabtjftxbapgvfsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438717.7789352-90-115171630184801/AnsiballZ_stat.py'
Jan 26 14:45:18 compute-1 sudo[76102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:18 compute-1 python3.9[76104]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:45:18 compute-1 sudo[76102]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:19 compute-1 sudo[76256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nucctqmlrgzxchjqmxwvaqetqniodlzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438718.6881728-106-279014402810349/AnsiballZ_command.py'
Jan 26 14:45:19 compute-1 sudo[76256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:19 compute-1 python3.9[76258]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:45:19 compute-1 sudo[76256]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:19 compute-1 sudo[76411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oitsoliiypgotmowjdqntjmbojpycdyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438719.4766674-122-50765914300843/AnsiballZ_file.py'
Jan 26 14:45:19 compute-1 sudo[76411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:20 compute-1 python3.9[76413]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:20 compute-1 sudo[76411]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:20 compute-1 sshd-session[75336]: Connection closed by 192.168.122.30 port 38624
Jan 26 14:45:20 compute-1 sshd-session[75333]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:45:20 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Jan 26 14:45:20 compute-1 systemd[1]: session-18.scope: Consumed 4.885s CPU time.
Jan 26 14:45:20 compute-1 systemd-logind[795]: Session 18 logged out. Waiting for processes to exit.
Jan 26 14:45:20 compute-1 systemd-logind[795]: Removed session 18.
Jan 26 14:45:21 compute-1 sshd-session[75543]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 45319 ssh2 [preauth]
Jan 26 14:45:21 compute-1 sshd-session[75543]: Disconnecting authenticating user root 185.246.128.170 port 45319: Too many authentication failures [preauth]
Jan 26 14:45:25 compute-1 sshd-session[76440]: Accepted publickey for zuul from 192.168.122.30 port 47262 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:45:25 compute-1 systemd-logind[795]: New session 19 of user zuul.
Jan 26 14:45:25 compute-1 systemd[1]: Started Session 19 of User zuul.
Jan 26 14:45:25 compute-1 sshd-session[76440]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:45:26 compute-1 python3.9[76593]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:45:27 compute-1 sudo[76747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqqzingyureiutqyxxjsqfvkflzatywy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438726.9446287-44-151470904432574/AnsiballZ_setup.py'
Jan 26 14:45:27 compute-1 sudo[76747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:27 compute-1 python3.9[76749]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:45:27 compute-1 sudo[76747]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:28 compute-1 sudo[76831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktrcysnasuldwpxizvldbwhkpbislsnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438726.9446287-44-151470904432574/AnsiballZ_dnf.py'
Jan 26 14:45:28 compute-1 sudo[76831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:28 compute-1 python3.9[76833]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 14:45:30 compute-1 sudo[76831]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:31 compute-1 python3.9[76984]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:45:32 compute-1 python3.9[77135]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 14:45:33 compute-1 python3.9[77285]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:45:34 compute-1 python3.9[77435]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:45:34 compute-1 sshd-session[76438]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 57331 ssh2 [preauth]
Jan 26 14:45:34 compute-1 sshd-session[76438]: Disconnecting authenticating user root 185.246.128.170 port 57331: Too many authentication failures [preauth]
Jan 26 14:45:34 compute-1 sshd-session[76443]: Connection closed by 192.168.122.30 port 47262
Jan 26 14:45:34 compute-1 sshd-session[76440]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:45:34 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Jan 26 14:45:34 compute-1 systemd[1]: session-19.scope: Consumed 6.134s CPU time.
Jan 26 14:45:34 compute-1 systemd-logind[795]: Session 19 logged out. Waiting for processes to exit.
Jan 26 14:45:34 compute-1 systemd-logind[795]: Removed session 19.
Jan 26 14:45:39 compute-1 sshd-session[77462]: Accepted publickey for zuul from 192.168.122.30 port 53932 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:45:39 compute-1 systemd-logind[795]: New session 20 of user zuul.
Jan 26 14:45:39 compute-1 systemd[1]: Started Session 20 of User zuul.
Jan 26 14:45:39 compute-1 sshd-session[77462]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:45:40 compute-1 python3.9[77615]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:45:43 compute-1 sudo[77769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffstaflofkdgmrrpcegqdrxwenrhdhzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438742.7678747-76-93342126716670/AnsiballZ_file.py'
Jan 26 14:45:43 compute-1 sudo[77769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:43 compute-1 python3.9[77771]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:45:43 compute-1 sudo[77769]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:43 compute-1 sudo[77921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejypzytvzfdccqjjldsqncjqeaqgtobj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438743.5674734-76-100007380989608/AnsiballZ_file.py'
Jan 26 14:45:43 compute-1 sudo[77921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:44 compute-1 python3.9[77923]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:45:44 compute-1 sudo[77921]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:44 compute-1 sudo[78073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwyjhbqdnpyxeycdfxsezdivtcxiuzcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438744.2511134-104-19047965311658/AnsiballZ_stat.py'
Jan 26 14:45:44 compute-1 sudo[78073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:44 compute-1 python3.9[78075]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:45:44 compute-1 sudo[78073]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:45 compute-1 sudo[78196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvnpofwrihmumzzpipriivpghybzvikv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438744.2511134-104-19047965311658/AnsiballZ_copy.py'
Jan 26 14:45:45 compute-1 sudo[78196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:45 compute-1 python3.9[78198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438744.2511134-104-19047965311658/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=59ba278d4816adb8fe423d82b9c91d43f242f66b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:45 compute-1 sudo[78196]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:45 compute-1 sudo[78348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xistlmcxrqlbfhhbcofjvytggsopauts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438745.6913402-104-233873335797189/AnsiballZ_stat.py'
Jan 26 14:45:45 compute-1 sudo[78348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:46 compute-1 sshd-session[77460]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 55570 ssh2 [preauth]
Jan 26 14:45:46 compute-1 sshd-session[77460]: Disconnecting authenticating user root 185.246.128.170 port 55570: Too many authentication failures [preauth]
Jan 26 14:45:46 compute-1 python3.9[78350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:45:46 compute-1 sudo[78348]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:46 compute-1 sudo[78471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udsrdvevmvfqarhwkuyzyrnsagpnmkuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438745.6913402-104-233873335797189/AnsiballZ_copy.py'
Jan 26 14:45:46 compute-1 sudo[78471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:46 compute-1 python3.9[78473]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438745.6913402-104-233873335797189/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=8c2687ded664dabfc7eef0bad6deb73accdd8441 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:46 compute-1 sudo[78471]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:47 compute-1 sudo[78625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drigkfkvfnzzwopouhwdjtuvylipfsci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438746.85359-104-57789197499047/AnsiballZ_stat.py'
Jan 26 14:45:47 compute-1 sudo[78625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:47 compute-1 python3.9[78627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:45:47 compute-1 sudo[78625]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:47 compute-1 sudo[78748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrfonjydcrgmklrzrgiycalhucfstpto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438746.85359-104-57789197499047/AnsiballZ_copy.py'
Jan 26 14:45:47 compute-1 sudo[78748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:47 compute-1 python3.9[78750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438746.85359-104-57789197499047/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=5b74d6b511edb20a4036d1147c4c912377236750 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:47 compute-1 sudo[78748]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:48 compute-1 sudo[78900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxykakxzasgoaljjkgifuerlsstdbmwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438748.0603218-193-122448272850765/AnsiballZ_file.py'
Jan 26 14:45:48 compute-1 sudo[78900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:48 compute-1 python3.9[78902]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:45:48 compute-1 sudo[78900]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:48 compute-1 sudo[79052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecudddgrmshwjlmpbvssngajvfsmyejf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438748.6835904-193-164590127507639/AnsiballZ_file.py'
Jan 26 14:45:48 compute-1 sudo[79052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:49 compute-1 python3.9[79054]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:45:49 compute-1 sudo[79052]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:49 compute-1 sudo[79204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbavveiunepyreiakocwjspkvmrykujd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438749.3753362-226-71624057965457/AnsiballZ_stat.py'
Jan 26 14:45:49 compute-1 sudo[79204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:49 compute-1 python3.9[79206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:45:49 compute-1 sudo[79204]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:50 compute-1 sudo[79327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjdrlsqqiqvfujgilvmwnukhkryqiqoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438749.3753362-226-71624057965457/AnsiballZ_copy.py'
Jan 26 14:45:50 compute-1 sudo[79327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:50 compute-1 python3.9[79329]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438749.3753362-226-71624057965457/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=eb7f9abaea6ced10e7b46a8a68f15f9eb433d9a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:50 compute-1 sudo[79327]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:50 compute-1 sudo[79479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfyvfzyprjthjvplezacqleccbpklssx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438750.5457456-226-57747292080425/AnsiballZ_stat.py'
Jan 26 14:45:50 compute-1 sudo[79479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:51 compute-1 python3.9[79481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:45:51 compute-1 sudo[79479]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:51 compute-1 sudo[79602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjjkcuiotxrgumwmatocyekxzntpjrbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438750.5457456-226-57747292080425/AnsiballZ_copy.py'
Jan 26 14:45:51 compute-1 sudo[79602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:51 compute-1 python3.9[79604]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438750.5457456-226-57747292080425/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=40578f23e1702ecc72ae9db5e528e925bc6b32ce backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:51 compute-1 sudo[79602]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:52 compute-1 sudo[79754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdoohmmehgtwfbvxoqhhfigohuikkwhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438751.8597455-226-241783103048193/AnsiballZ_stat.py'
Jan 26 14:45:52 compute-1 sudo[79754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:52 compute-1 python3.9[79756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:45:52 compute-1 sudo[79754]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:52 compute-1 sudo[79877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kilgjiqwwikfvzssvxtccblcdqekvfxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438751.8597455-226-241783103048193/AnsiballZ_copy.py'
Jan 26 14:45:52 compute-1 sudo[79877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:52 compute-1 python3.9[79879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438751.8597455-226-241783103048193/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=cb8daf4643c0370d62b2695b43b1f6f8d338e62e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:52 compute-1 sudo[79877]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:53 compute-1 sudo[80029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkrpmvbdyphtqasgakpiaowgdjwxjnan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438753.07336-318-33618198384042/AnsiballZ_file.py'
Jan 26 14:45:53 compute-1 sudo[80029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:53 compute-1 python3.9[80031]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:45:53 compute-1 sudo[80029]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:53 compute-1 sudo[80181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohnxdmmknopyevocrwbugqbfpiwbcozb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438753.7250905-318-27795240992201/AnsiballZ_file.py'
Jan 26 14:45:54 compute-1 sudo[80181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:54 compute-1 python3.9[80183]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:45:54 compute-1 sudo[80181]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:54 compute-1 sshd-session[78498]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 34391 ssh2 [preauth]
Jan 26 14:45:54 compute-1 sshd-session[78498]: Disconnecting authenticating user root 185.246.128.170 port 34391: Too many authentication failures [preauth]
Jan 26 14:45:54 compute-1 sudo[80333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceoluzfbuqjqpdcfwpajsfblgsmjnnxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438754.4008784-348-29290909622928/AnsiballZ_stat.py'
Jan 26 14:45:54 compute-1 sudo[80333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:54 compute-1 python3.9[80335]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:45:54 compute-1 sudo[80333]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:55 compute-1 sudo[80456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlyhqutlgcsomneiovxxgicyaulxwbeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438754.4008784-348-29290909622928/AnsiballZ_copy.py'
Jan 26 14:45:55 compute-1 sudo[80456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:55 compute-1 python3.9[80458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438754.4008784-348-29290909622928/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=7c02479c0ce9ad7ca654411b698cfc33a996d0db backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:55 compute-1 sudo[80456]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:56 compute-1 sudo[80608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amclyzbzfbaqlimczdffhwiljnjarlag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438755.7535865-348-68940074344642/AnsiballZ_stat.py'
Jan 26 14:45:56 compute-1 sudo[80608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:56 compute-1 python3.9[80610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:45:56 compute-1 sudo[80608]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:56 compute-1 sudo[80731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzrdaygxersrqcsktkyeyyrxlmjzmqih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438755.7535865-348-68940074344642/AnsiballZ_copy.py'
Jan 26 14:45:56 compute-1 sudo[80731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:56 compute-1 python3.9[80733]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438755.7535865-348-68940074344642/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=4ccdbd90259e3a7e1699d2b1db675be8589288af backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:56 compute-1 sudo[80731]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:57 compute-1 sudo[80885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmkhllbsqutaejvpyzoogmpelslmptje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438756.9213474-348-82303543603414/AnsiballZ_stat.py'
Jan 26 14:45:57 compute-1 sudo[80885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:57 compute-1 python3.9[80887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:45:57 compute-1 sudo[80885]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:57 compute-1 sudo[81008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iulmigocygaiwwerhydwdfkgvceazogy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438756.9213474-348-82303543603414/AnsiballZ_copy.py'
Jan 26 14:45:57 compute-1 sudo[81008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:57 compute-1 python3.9[81010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438756.9213474-348-82303543603414/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=2a84df7858e256fb93cf6de601b38a4254851740 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:45:57 compute-1 sudo[81008]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:58 compute-1 sudo[81160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzdpugqsnclyewioaxplhsrbfiowatgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438758.183595-437-172078723829616/AnsiballZ_file.py'
Jan 26 14:45:58 compute-1 sudo[81160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:58 compute-1 python3.9[81162]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:45:58 compute-1 sudo[81160]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:59 compute-1 sudo[81312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oanletrpqjcldjjadigqtwemvtgdiqag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438758.8361945-437-220467910102912/AnsiballZ_file.py'
Jan 26 14:45:59 compute-1 sudo[81312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:45:59 compute-1 python3.9[81314]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:45:59 compute-1 sudo[81312]: pam_unix(sudo:session): session closed for user root
Jan 26 14:45:59 compute-1 sudo[81464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmlmdeggyidaitacustejivyhydtujmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438759.541146-471-126054530403990/AnsiballZ_stat.py'
Jan 26 14:45:59 compute-1 sudo[81464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:00 compute-1 python3.9[81466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:00 compute-1 sudo[81464]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:00 compute-1 sudo[81587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usmyjzftdodgxtbvvblxlnswtvjvfcnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438759.541146-471-126054530403990/AnsiballZ_copy.py'
Jan 26 14:46:00 compute-1 sudo[81587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:00 compute-1 python3.9[81589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438759.541146-471-126054530403990/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=0287e21c446c914945e23509271e9686a42c5435 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:00 compute-1 sudo[81587]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:01 compute-1 sudo[81739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjkyngdgflehlnsruoauunljawzcpovv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438760.8186793-471-258603675562199/AnsiballZ_stat.py'
Jan 26 14:46:01 compute-1 sudo[81739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:01 compute-1 python3.9[81741]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:01 compute-1 sudo[81739]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:01 compute-1 sudo[81862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqtgbfjzgsomloqvibrkejqgqcjdsboi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438760.8186793-471-258603675562199/AnsiballZ_copy.py'
Jan 26 14:46:01 compute-1 sudo[81862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:01 compute-1 python3.9[81864]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438760.8186793-471-258603675562199/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=4ccdbd90259e3a7e1699d2b1db675be8589288af backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:01 compute-1 sudo[81862]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:02 compute-1 sudo[82014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hauwjoqnvujkhbdkeuhjatvouukliayx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438762.029933-471-49011043545542/AnsiballZ_stat.py'
Jan 26 14:46:02 compute-1 sudo[82014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:02 compute-1 chronyd[65168]: Selected source 167.160.187.179 (pool.ntp.org)
Jan 26 14:46:02 compute-1 python3.9[82016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:02 compute-1 sudo[82014]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:02 compute-1 sudo[82137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puxyvifuikaxhtsduxqfapblvhujwvkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438762.029933-471-49011043545542/AnsiballZ_copy.py'
Jan 26 14:46:02 compute-1 sudo[82137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:03 compute-1 python3.9[82139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438762.029933-471-49011043545542/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=f478cbe9705d2694c321f934353023f3066bbab5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:03 compute-1 sudo[82137]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:03 compute-1 sudo[82289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifjekrrebqcbtenqrqefuovhgnovgtfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438763.6717296-592-114007028777593/AnsiballZ_file.py'
Jan 26 14:46:03 compute-1 sudo[82289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:04 compute-1 python3.9[82291]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:46:04 compute-1 sudo[82289]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:04 compute-1 sudo[82441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-invztsvajwkrspvfsakyhshajljyetxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438764.3118882-607-247041034109421/AnsiballZ_stat.py'
Jan 26 14:46:04 compute-1 sudo[82441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:04 compute-1 python3.9[82443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:04 compute-1 sudo[82441]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:05 compute-1 sudo[82564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pycuauzacxulzftevcrjvztvnutsigbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438764.3118882-607-247041034109421/AnsiballZ_copy.py'
Jan 26 14:46:05 compute-1 sudo[82564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:05 compute-1 python3.9[82566]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438764.3118882-607-247041034109421/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=539478296eac4798b42b7e16b49efacc0999af66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:05 compute-1 sudo[82564]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:05 compute-1 sudo[82716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-booeuatflxachfuqbwkcgobczcrvwzai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438765.6019475-636-197851827737235/AnsiballZ_file.py'
Jan 26 14:46:05 compute-1 sudo[82716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:06 compute-1 python3.9[82718]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:46:06 compute-1 sudo[82716]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:06 compute-1 sudo[82868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ameesrghrdobyyegytjluuygsrvjcbpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438766.2751975-655-1681150039951/AnsiballZ_stat.py'
Jan 26 14:46:06 compute-1 sudo[82868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:06 compute-1 python3.9[82870]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:06 compute-1 sudo[82868]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:07 compute-1 sudo[82991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rurwzzymiidgfmkxjjjufcuxqfxmjssm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438766.2751975-655-1681150039951/AnsiballZ_copy.py'
Jan 26 14:46:07 compute-1 sudo[82991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:07 compute-1 python3.9[82993]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438766.2751975-655-1681150039951/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=539478296eac4798b42b7e16b49efacc0999af66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:07 compute-1 sshd-session[80734]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 23397 ssh2 [preauth]
Jan 26 14:46:07 compute-1 sshd-session[80734]: Disconnecting authenticating user root 185.246.128.170 port 23397: Too many authentication failures [preauth]
Jan 26 14:46:07 compute-1 sudo[82991]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:07 compute-1 sudo[83143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rftwtrvacijqkerhvhgteoehkfqagdqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438767.4884613-686-217495297962316/AnsiballZ_file.py'
Jan 26 14:46:07 compute-1 sudo[83143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:07 compute-1 python3.9[83145]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:46:07 compute-1 sudo[83143]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:08 compute-1 sudo[83295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpohwewwasiruhsljjckhxbdfjxxkpfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438768.0799484-701-41436415855845/AnsiballZ_stat.py'
Jan 26 14:46:08 compute-1 sudo[83295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:08 compute-1 python3.9[83297]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:08 compute-1 sudo[83295]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:08 compute-1 sudo[83418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqmoycrgmqxxusyvrjdorauvqovwnyug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438768.0799484-701-41436415855845/AnsiballZ_copy.py'
Jan 26 14:46:08 compute-1 sudo[83418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:09 compute-1 python3.9[83420]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438768.0799484-701-41436415855845/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=539478296eac4798b42b7e16b49efacc0999af66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:09 compute-1 sudo[83418]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:09 compute-1 sudo[83570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdzyygzkvucfqaofehciizodunladffr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438769.429277-731-190937628354799/AnsiballZ_file.py'
Jan 26 14:46:09 compute-1 sudo[83570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:09 compute-1 python3.9[83572]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:46:09 compute-1 sudo[83570]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:10 compute-1 sudo[83722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iupccyenfbobbkctrzqusmvafnlkuasj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438770.1115994-749-266463557247236/AnsiballZ_stat.py'
Jan 26 14:46:10 compute-1 sudo[83722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:10 compute-1 python3.9[83724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:10 compute-1 sudo[83722]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:10 compute-1 sudo[83847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-situjvcbijivkyddfslslvolmjmmtozz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438770.1115994-749-266463557247236/AnsiballZ_copy.py'
Jan 26 14:46:10 compute-1 sudo[83847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:11 compute-1 python3.9[83849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438770.1115994-749-266463557247236/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=539478296eac4798b42b7e16b49efacc0999af66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:11 compute-1 sudo[83847]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:11 compute-1 sudo[83999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixaydjejnczktwzzaypsbttqxguekeyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438771.3523295-779-151522365180420/AnsiballZ_file.py'
Jan 26 14:46:11 compute-1 sudo[83999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:11 compute-1 python3.9[84001]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:46:11 compute-1 sudo[83999]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:12 compute-1 sudo[84151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elogqzncthntiroazetlpyeyoqrayajq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438771.9781082-794-152302563251042/AnsiballZ_stat.py'
Jan 26 14:46:12 compute-1 sudo[84151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:12 compute-1 python3.9[84153]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:12 compute-1 sudo[84151]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:12 compute-1 sudo[84274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwmbictznenpalhqzefdmyeefvtntucy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438771.9781082-794-152302563251042/AnsiballZ_copy.py'
Jan 26 14:46:12 compute-1 sudo[84274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:12 compute-1 python3.9[84276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438771.9781082-794-152302563251042/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=539478296eac4798b42b7e16b49efacc0999af66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:13 compute-1 sudo[84274]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:13 compute-1 sudo[84426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmhkypcjtchvfhmltetnaedlxwcfykvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438773.1810489-826-70528227817009/AnsiballZ_file.py'
Jan 26 14:46:13 compute-1 sudo[84426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:13 compute-1 python3.9[84428]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:46:13 compute-1 sudo[84426]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:14 compute-1 sudo[84578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kceaxfupdfatbksdnzegmdhdxwoyrjzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438773.80472-842-210222388336561/AnsiballZ_stat.py'
Jan 26 14:46:14 compute-1 sudo[84578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:14 compute-1 python3.9[84580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:14 compute-1 sudo[84578]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:14 compute-1 sudo[84701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moeuoiksgpepdgjjbiierdvaqaabytwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438773.80472-842-210222388336561/AnsiballZ_copy.py'
Jan 26 14:46:14 compute-1 sudo[84701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:14 compute-1 python3.9[84703]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438773.80472-842-210222388336561/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=539478296eac4798b42b7e16b49efacc0999af66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:15 compute-1 sudo[84701]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:15 compute-1 sudo[84853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dejggrlpnpcjhcuetztjqeksgukyjjkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438775.1999054-874-85932892089852/AnsiballZ_file.py'
Jan 26 14:46:15 compute-1 sudo[84853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:15 compute-1 python3.9[84855]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:46:15 compute-1 sudo[84853]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:16 compute-1 sudo[85005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxgndakfaknavwaburrcnnutbyfxnivt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438775.8174205-889-101081213384550/AnsiballZ_stat.py'
Jan 26 14:46:16 compute-1 sudo[85005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:16 compute-1 python3.9[85007]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:16 compute-1 sudo[85005]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:16 compute-1 sudo[85128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpgwvuthpdnaaexuezltxranjxjowvho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438775.8174205-889-101081213384550/AnsiballZ_copy.py'
Jan 26 14:46:16 compute-1 sudo[85128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:16 compute-1 python3.9[85130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438775.8174205-889-101081213384550/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=539478296eac4798b42b7e16b49efacc0999af66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:16 compute-1 sudo[85128]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:17 compute-1 sshd-session[77465]: Connection closed by 192.168.122.30 port 53932
Jan 26 14:46:17 compute-1 sshd-session[77462]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:46:17 compute-1 systemd-logind[795]: Session 20 logged out. Waiting for processes to exit.
Jan 26 14:46:17 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Jan 26 14:46:17 compute-1 systemd[1]: session-20.scope: Consumed 28.208s CPU time.
Jan 26 14:46:17 compute-1 systemd-logind[795]: Removed session 20.
Jan 26 14:46:23 compute-1 sshd-session[85155]: Accepted publickey for zuul from 192.168.122.30 port 40644 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:46:23 compute-1 systemd-logind[795]: New session 21 of user zuul.
Jan 26 14:46:23 compute-1 systemd[1]: Started Session 21 of User zuul.
Jan 26 14:46:23 compute-1 sshd-session[85155]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:46:24 compute-1 python3.9[85308]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:46:24 compute-1 sudo[85462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydbxuggztlccgqfdgmiqrszbzlzcepwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438784.576407-44-168916742152881/AnsiballZ_file.py'
Jan 26 14:46:24 compute-1 sudo[85462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:25 compute-1 python3.9[85464]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:46:25 compute-1 sudo[85462]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:25 compute-1 sudo[85614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msjtrfxwuacfcxxoxncdlbiwienzqkqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438785.4089742-44-102929216909932/AnsiballZ_file.py'
Jan 26 14:46:25 compute-1 sudo[85614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:25 compute-1 python3.9[85616]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:46:25 compute-1 sudo[85614]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:26 compute-1 python3.9[85766]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:46:27 compute-1 sudo[85916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlewrahvdkmtosvumyzyuqnbhrwmhgdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438786.9358003-90-47198692454965/AnsiballZ_seboolean.py'
Jan 26 14:46:27 compute-1 sudo[85916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:27 compute-1 python3.9[85918]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 14:46:27 compute-1 sshd-session[83725]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 48417 ssh2 [preauth]
Jan 26 14:46:27 compute-1 sshd-session[83725]: Disconnecting authenticating user root 185.246.128.170 port 48417: Too many authentication failures [preauth]
Jan 26 14:46:28 compute-1 sudo[85916]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:29 compute-1 sudo[86072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eelfnyvezzfiauqsbleavcnwolnkkrir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438789.17817-110-129279752576882/AnsiballZ_setup.py'
Jan 26 14:46:29 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 26 14:46:29 compute-1 sudo[86072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:29 compute-1 python3.9[86074]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:46:30 compute-1 sudo[86072]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:30 compute-1 sudo[86157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oisnhekkbcwaoovppdgfdqpihveodreh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438789.17817-110-129279752576882/AnsiballZ_dnf.py'
Jan 26 14:46:30 compute-1 sudo[86157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:30 compute-1 python3.9[86159]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:46:32 compute-1 sudo[86157]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:33 compute-1 sudo[86312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlmsskcghjqbhkdcbllklvzjcpvupyoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438792.836384-134-95679615924324/AnsiballZ_systemd.py'
Jan 26 14:46:33 compute-1 sudo[86312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:33 compute-1 python3.9[86314]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 14:46:34 compute-1 sudo[86312]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:35 compute-1 sudo[86467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhzopcujjoazmijfjpqvwejacptzvhie ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769438795.0048892-150-57824853162146/AnsiballZ_edpm_nftables_snippet.py'
Jan 26 14:46:35 compute-1 sudo[86467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:35 compute-1 python3[86469]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 26 14:46:35 compute-1 sudo[86467]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:36 compute-1 sudo[86619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtcifxigjpbtzxvfbdjzyxyrwfbassif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438795.909321-168-110901344142307/AnsiballZ_file.py'
Jan 26 14:46:36 compute-1 sudo[86619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:36 compute-1 python3.9[86621]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:36 compute-1 sudo[86619]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:37 compute-1 sudo[86771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufciyrqoyuapxvyakzaixidstbdflhsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438796.872388-184-198255517006381/AnsiballZ_stat.py'
Jan 26 14:46:37 compute-1 sudo[86771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:37 compute-1 python3.9[86773]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:37 compute-1 sudo[86771]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:37 compute-1 sudo[86849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbwwtonagoqexeaggecpqzvxdbojulrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438796.872388-184-198255517006381/AnsiballZ_file.py'
Jan 26 14:46:37 compute-1 sudo[86849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:37 compute-1 python3.9[86851]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:37 compute-1 sudo[86849]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:38 compute-1 sudo[87001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqsgltkskeqjrzltjionunjwtwgajwur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438798.1509855-208-63944180889285/AnsiballZ_stat.py'
Jan 26 14:46:38 compute-1 sudo[87001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:38 compute-1 python3.9[87003]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:38 compute-1 sudo[87001]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:38 compute-1 sudo[87079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spdsbrgjlhipszdlgcijjqbllkxsickd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438798.1509855-208-63944180889285/AnsiballZ_file.py'
Jan 26 14:46:38 compute-1 sudo[87079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:39 compute-1 python3.9[87081]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.m7bq2pm1 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:39 compute-1 sudo[87079]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:39 compute-1 sudo[87231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvvigyhfsduybevoeoecubfobxhespdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438799.4607332-232-207668376574297/AnsiballZ_stat.py'
Jan 26 14:46:39 compute-1 sudo[87231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:39 compute-1 python3.9[87233]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:39 compute-1 sudo[87231]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:40 compute-1 sudo[87309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqkgpwcqqtzxgtwptvgorvqpxrpgxlbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438799.4607332-232-207668376574297/AnsiballZ_file.py'
Jan 26 14:46:40 compute-1 sudo[87309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:40 compute-1 python3.9[87311]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:40 compute-1 sudo[87309]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:41 compute-1 sudo[87461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkrefxlunlapivdrhtetifumrkmimhmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438800.633673-258-98354369581964/AnsiballZ_command.py'
Jan 26 14:46:41 compute-1 sudo[87461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:41 compute-1 python3.9[87463]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:46:41 compute-1 sudo[87461]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:42 compute-1 sudo[87614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmejlltemksxcmhbbfkoazpotqknqdcm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769438802.5126235-274-140655731424212/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 14:46:42 compute-1 sudo[87614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:43 compute-1 python3[87616]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 14:46:43 compute-1 sudo[87614]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:43 compute-1 sudo[87766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cutzkxuobfxpmodtwwgotyehvwgcrksy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438803.3543699-290-80296155428176/AnsiballZ_stat.py'
Jan 26 14:46:43 compute-1 sudo[87766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:43 compute-1 python3.9[87768]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:43 compute-1 sudo[87766]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:44 compute-1 sudo[87891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bexumsehxdqlawergixhapllsfqvecrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438803.3543699-290-80296155428176/AnsiballZ_copy.py'
Jan 26 14:46:44 compute-1 sudo[87891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:44 compute-1 python3.9[87893]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438803.3543699-290-80296155428176/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:44 compute-1 sudo[87891]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:45 compute-1 sudo[88043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhrvhtxzrglupwkmseszudzvgnkyehww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438804.8077998-320-248017265055236/AnsiballZ_stat.py'
Jan 26 14:46:45 compute-1 sudo[88043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:45 compute-1 python3.9[88045]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:45 compute-1 sudo[88043]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:45 compute-1 sudo[88168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtgujqfqphqakuiympzcvaugmyldqfrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438804.8077998-320-248017265055236/AnsiballZ_copy.py'
Jan 26 14:46:45 compute-1 sudo[88168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:45 compute-1 python3.9[88170]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438804.8077998-320-248017265055236/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:45 compute-1 sudo[88168]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:46 compute-1 sudo[88320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spsqfseddirpqsukoruxmzccuzhhddqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438806.1452215-350-22874069268520/AnsiballZ_stat.py'
Jan 26 14:46:46 compute-1 sudo[88320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:46 compute-1 python3.9[88322]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:46 compute-1 sudo[88320]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:47 compute-1 sudo[88445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orkgelagjzclfvrlgrghaxdjeunzptbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438806.1452215-350-22874069268520/AnsiballZ_copy.py'
Jan 26 14:46:47 compute-1 sudo[88445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:47 compute-1 sshd-session[86185]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 53022 ssh2 [preauth]
Jan 26 14:46:47 compute-1 sshd-session[86185]: Disconnecting authenticating user root 185.246.128.170 port 53022: Too many authentication failures [preauth]
Jan 26 14:46:47 compute-1 python3.9[88447]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438806.1452215-350-22874069268520/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:47 compute-1 sudo[88445]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:47 compute-1 sudo[88597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fulqjqnssqqhaxqentfwxprzsssjdyyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438807.4706464-380-86068265987953/AnsiballZ_stat.py'
Jan 26 14:46:47 compute-1 sudo[88597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:48 compute-1 python3.9[88599]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:48 compute-1 sudo[88597]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:48 compute-1 sudo[88722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zygmxwevhhsmiapzepwagqzufetbeoyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438807.4706464-380-86068265987953/AnsiballZ_copy.py'
Jan 26 14:46:48 compute-1 sudo[88722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:48 compute-1 python3.9[88724]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438807.4706464-380-86068265987953/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:48 compute-1 sudo[88722]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:49 compute-1 sudo[88874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfgjwlfsfrumrkypqubgxoutgypkbeba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438808.8492064-410-122235021145535/AnsiballZ_stat.py'
Jan 26 14:46:49 compute-1 sudo[88874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:49 compute-1 python3.9[88876]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:46:49 compute-1 sudo[88874]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:49 compute-1 sudo[88999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkuykdoclmazejzbuvvrmeueftgqpmoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438808.8492064-410-122235021145535/AnsiballZ_copy.py'
Jan 26 14:46:49 compute-1 sudo[88999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:49 compute-1 python3.9[89001]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769438808.8492064-410-122235021145535/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:50 compute-1 sudo[88999]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:50 compute-1 sudo[89151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgfchklhdydfgdcbmaaiyiobncqksmbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438810.1791785-440-278280151651941/AnsiballZ_file.py'
Jan 26 14:46:50 compute-1 sudo[89151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:50 compute-1 python3.9[89153]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:50 compute-1 sudo[89151]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:51 compute-1 sudo[89303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkdbymxgqjnlonoldfzefazcvabiekmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438810.843342-456-203883574810809/AnsiballZ_command.py'
Jan 26 14:46:51 compute-1 sudo[89303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:51 compute-1 python3.9[89305]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:46:51 compute-1 sudo[89303]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:52 compute-1 sudo[89460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eypmcldizzaxyrijadlpnuqhaqetcuqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438811.5230002-472-85235713984479/AnsiballZ_blockinfile.py'
Jan 26 14:46:52 compute-1 sudo[89460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:52 compute-1 python3.9[89462]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:52 compute-1 sudo[89460]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:52 compute-1 sudo[89612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlgoztlhnonzoaadmbbdkrimjmihlcfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438812.4874272-490-226423807767044/AnsiballZ_command.py'
Jan 26 14:46:52 compute-1 sudo[89612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:53 compute-1 python3.9[89614]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:46:53 compute-1 sudo[89612]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:53 compute-1 sudo[89765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utbjummbnddzylehqekouvxbjgperbgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438813.2701583-506-25312791514348/AnsiballZ_stat.py'
Jan 26 14:46:53 compute-1 sudo[89765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:53 compute-1 python3.9[89767]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:46:53 compute-1 sudo[89765]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:54 compute-1 sudo[89919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kslbdrcbeulsqmotrnmvvdishmzwlvel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438814.0279634-522-17805878700198/AnsiballZ_command.py'
Jan 26 14:46:54 compute-1 sudo[89919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:54 compute-1 python3.9[89921]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:46:54 compute-1 sudo[89919]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:54 compute-1 sudo[90074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soinxpfvjxuukomdregduzhhmegwtizh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438814.6958568-538-249438177828973/AnsiballZ_file.py'
Jan 26 14:46:54 compute-1 sudo[90074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:55 compute-1 python3.9[90076]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:46:55 compute-1 sudo[90074]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:56 compute-1 python3.9[90226]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:46:57 compute-1 sudo[90377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcawkidnretjvokxqpqiqxvmjwpfmzxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438816.9845557-618-249405453487904/AnsiballZ_command.py'
Jan 26 14:46:57 compute-1 sudo[90377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:57 compute-1 python3.9[90379]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:46:57 compute-1 ovs-vsctl[90380]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 26 14:46:57 compute-1 sudo[90377]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:57 compute-1 sudo[90530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svsgslennxmhudqsrjbibozczuyjthor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438817.7075782-636-9936026133428/AnsiballZ_command.py'
Jan 26 14:46:57 compute-1 sudo[90530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:58 compute-1 python3.9[90532]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:46:58 compute-1 sudo[90530]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:58 compute-1 sudo[90685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxepeazvsjdsmrpjjeflrgzzakkwtuwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438818.4244754-652-89479531788255/AnsiballZ_command.py'
Jan 26 14:46:58 compute-1 sudo[90685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:46:58 compute-1 python3.9[90687]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:6640:127.0.0.1\" -- add Open_vSwitch . manager_options @manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:46:58 compute-1 ovs-vsctl[90688]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 26 14:46:58 compute-1 sudo[90685]: pam_unix(sudo:session): session closed for user root
Jan 26 14:46:59 compute-1 python3.9[90838]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:47:00 compute-1 sudo[90991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqkvsndttkkqhqmxtgvqzofxagddrtas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438819.8840442-686-19033546031507/AnsiballZ_file.py'
Jan 26 14:47:00 compute-1 sudo[90991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:00 compute-1 python3.9[90993]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:00 compute-1 sudo[90991]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:00 compute-1 sudo[91143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqqtkutzfigwqswutshkbyuaifeuebso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438820.6253896-702-169065564529858/AnsiballZ_stat.py'
Jan 26 14:47:00 compute-1 sudo[91143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:01 compute-1 python3.9[91145]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:01 compute-1 sudo[91143]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:01 compute-1 sudo[91221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyjczmpifzigyrfopvyrwxwsrgehnliq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438820.6253896-702-169065564529858/AnsiballZ_file.py'
Jan 26 14:47:01 compute-1 sudo[91221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:01 compute-1 python3.9[91223]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:01 compute-1 sudo[91221]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:02 compute-1 sudo[91373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnokygukwfocdvuqzgkawvpartkzwkzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438821.746858-702-186622705886340/AnsiballZ_stat.py'
Jan 26 14:47:02 compute-1 sudo[91373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:02 compute-1 python3.9[91375]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:02 compute-1 sudo[91373]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:02 compute-1 sudo[91451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovsitjbahiayvassdbazuzvbwttmeake ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438821.746858-702-186622705886340/AnsiballZ_file.py'
Jan 26 14:47:02 compute-1 sudo[91451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:02 compute-1 sshd-session[89306]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 56167 ssh2 [preauth]
Jan 26 14:47:02 compute-1 sshd-session[89306]: Disconnecting authenticating user root 185.246.128.170 port 56167: Too many authentication failures [preauth]
Jan 26 14:47:02 compute-1 python3.9[91453]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:02 compute-1 sudo[91451]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:03 compute-1 sudo[91603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmxveemxtorjlzvffvjevaeglohmdjai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438822.8667471-748-266437503768887/AnsiballZ_file.py'
Jan 26 14:47:03 compute-1 sudo[91603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:03 compute-1 python3.9[91605]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:03 compute-1 sudo[91603]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:03 compute-1 sudo[91756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlqrngtxyazsudhxaztrddzrhreqisfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438823.5477817-764-195626692411072/AnsiballZ_stat.py'
Jan 26 14:47:03 compute-1 sudo[91756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:04 compute-1 python3.9[91758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:04 compute-1 sudo[91756]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:04 compute-1 sudo[91834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnycswlzipspnlosizjrnzcptbtapmaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438823.5477817-764-195626692411072/AnsiballZ_file.py'
Jan 26 14:47:04 compute-1 sudo[91834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:04 compute-1 python3.9[91836]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:04 compute-1 sudo[91834]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:05 compute-1 sudo[91986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbexryrlrrjferptvrhxgsifwqobvffq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438824.7448862-788-170954540767320/AnsiballZ_stat.py'
Jan 26 14:47:05 compute-1 sudo[91986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:05 compute-1 python3.9[91988]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:05 compute-1 sudo[91986]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:05 compute-1 sudo[92064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esibgojgjtrtlztfqokyzltkjdkunqeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438824.7448862-788-170954540767320/AnsiballZ_file.py'
Jan 26 14:47:05 compute-1 sudo[92064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:05 compute-1 python3.9[92066]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:05 compute-1 sudo[92064]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:06 compute-1 sudo[92217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvpwupksosuquxtkzngsuvljelubijpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438825.9107583-812-79575602623136/AnsiballZ_systemd.py'
Jan 26 14:47:06 compute-1 sudo[92217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:06 compute-1 python3.9[92219]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:47:06 compute-1 systemd[1]: Reloading.
Jan 26 14:47:06 compute-1 systemd-rc-local-generator[92241]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:47:06 compute-1 systemd-sysv-generator[92245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:47:06 compute-1 sudo[92217]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:07 compute-1 sudo[92405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvvpriminefwclsqawzfxfqmvtcgotvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438827.242562-828-225317722233884/AnsiballZ_stat.py'
Jan 26 14:47:07 compute-1 sudo[92405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:07 compute-1 python3.9[92407]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:07 compute-1 sudo[92405]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:08 compute-1 sudo[92483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxnghvyijjideiwnubexcyouwkjqjqcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438827.242562-828-225317722233884/AnsiballZ_file.py'
Jan 26 14:47:08 compute-1 sudo[92483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:08 compute-1 python3.9[92485]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:08 compute-1 sudo[92483]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:08 compute-1 sudo[92635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kolzcfqmeqjondtfikdsopplzljskfzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438828.435936-852-179712681258022/AnsiballZ_stat.py'
Jan 26 14:47:08 compute-1 sudo[92635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:08 compute-1 python3.9[92637]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:08 compute-1 sudo[92635]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:09 compute-1 sudo[92713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcallfxzedkxqzvvzejjaebjfrgomirq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438828.435936-852-179712681258022/AnsiballZ_file.py'
Jan 26 14:47:09 compute-1 sudo[92713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:09 compute-1 python3.9[92715]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:09 compute-1 sudo[92713]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:09 compute-1 sudo[92865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfqgmilvgyurxoxwntebngcybmwkqbjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438829.6241293-876-38297517460723/AnsiballZ_systemd.py'
Jan 26 14:47:09 compute-1 sudo[92865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:10 compute-1 python3.9[92867]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:47:10 compute-1 systemd[1]: Reloading.
Jan 26 14:47:10 compute-1 systemd-sysv-generator[92899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:47:10 compute-1 systemd-rc-local-generator[92896]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:47:10 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 14:47:10 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 14:47:10 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 14:47:10 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 14:47:10 compute-1 sudo[92865]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:11 compute-1 sudo[93061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkibbazjuvledisgvonpoqfjudhhbfqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438830.793798-896-25231125022010/AnsiballZ_file.py'
Jan 26 14:47:11 compute-1 sudo[93061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:11 compute-1 python3.9[93063]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:11 compute-1 sudo[93061]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:11 compute-1 sudo[93213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vonriygizyvrdsqxobzpkwftfiqmpsab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438831.4729555-912-97027586335969/AnsiballZ_stat.py'
Jan 26 14:47:11 compute-1 sudo[93213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:11 compute-1 python3.9[93215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:11 compute-1 sudo[93213]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:12 compute-1 sudo[93336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bekonnecsigmendvvseoqitogltbvpgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438831.4729555-912-97027586335969/AnsiballZ_copy.py'
Jan 26 14:47:12 compute-1 sudo[93336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:12 compute-1 python3.9[93338]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438831.4729555-912-97027586335969/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:12 compute-1 sudo[93336]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:13 compute-1 sudo[93488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynxreotcyeflhoqndvjmdyvulqmvgzpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438832.8911786-946-24844658101444/AnsiballZ_file.py'
Jan 26 14:47:13 compute-1 sudo[93488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:13 compute-1 python3.9[93490]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:13 compute-1 sudo[93488]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:13 compute-1 sudo[93640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzyspvbtxdxolzpmeytzcqiyfmzbouuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438833.5782514-962-107187766114164/AnsiballZ_file.py'
Jan 26 14:47:13 compute-1 sudo[93640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:14 compute-1 python3.9[93642]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:14 compute-1 sudo[93640]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:14 compute-1 sudo[93792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpzslkyobpzpvnwzjcjesqjgqchpjjit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438834.2821019-978-173697929081161/AnsiballZ_stat.py'
Jan 26 14:47:14 compute-1 sudo[93792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:14 compute-1 python3.9[93794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:14 compute-1 sudo[93792]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:15 compute-1 sudo[93915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvtdqyjsbdzlidnsralgdpifphnrojfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438834.2821019-978-173697929081161/AnsiballZ_copy.py'
Jan 26 14:47:15 compute-1 sudo[93915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:15 compute-1 python3.9[93917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438834.2821019-978-173697929081161/.source.json _original_basename=.gazblhvu follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:15 compute-1 sudo[93915]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:16 compute-1 python3.9[94067]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:18 compute-1 sudo[94488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjppddhyrfrsdrhkhwsaksovrzwebrtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438837.6022296-1058-260498417392778/AnsiballZ_container_config_data.py'
Jan 26 14:47:18 compute-1 sudo[94488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:18 compute-1 python3.9[94490]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 26 14:47:18 compute-1 sudo[94488]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:19 compute-1 sudo[94640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dndadfhjbiamvexkescluvqevewlfbpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438838.6517265-1080-125243666050478/AnsiballZ_container_config_hash.py'
Jan 26 14:47:19 compute-1 sudo[94640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:19 compute-1 python3.9[94642]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 14:47:19 compute-1 sudo[94640]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:20 compute-1 sudo[94792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lojuthxenuzkfyxoihnpdodjdrbyjlxw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769438839.6639342-1100-138384080749198/AnsiballZ_edpm_container_manage.py'
Jan 26 14:47:20 compute-1 sudo[94792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:20 compute-1 python3[94794]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 14:47:20 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:47:20 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:47:20 compute-1 podman[94830]: 2026-01-26 14:47:20.64845185 +0000 UTC m=+0.050720468 container create b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Jan 26 14:47:20 compute-1 podman[94830]: 2026-01-26 14:47:20.620058846 +0000 UTC m=+0.022327494 image pull f8729094371621355e0152ff34e85f25e048ce5f2426134c9fea76fcb24d5c9d 38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Jan 26 14:47:20 compute-1 python3[94794]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Jan 26 14:47:20 compute-1 sudo[94792]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:21 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 14:47:21 compute-1 sudo[95017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odhaejrbblcrpjijmwhdkrwcfdraraqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438841.2090452-1116-198925270324952/AnsiballZ_stat.py'
Jan 26 14:47:21 compute-1 sudo[95017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:21 compute-1 python3.9[95019]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:47:21 compute-1 sudo[95017]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:22 compute-1 sudo[95171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmeokycmqkibiiltfixkkxeqocrzvoub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438841.9909902-1134-87678855159555/AnsiballZ_file.py'
Jan 26 14:47:22 compute-1 sudo[95171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:22 compute-1 python3.9[95173]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:22 compute-1 sudo[95171]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:22 compute-1 sudo[95247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsetdodambqlygiizcctlmnfqeclmrun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438841.9909902-1134-87678855159555/AnsiballZ_stat.py'
Jan 26 14:47:22 compute-1 sudo[95247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:22 compute-1 python3.9[95249]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:47:22 compute-1 sudo[95247]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:23 compute-1 sudo[95398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sduawzjjaoquiutuwegcglxitrspmama ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438842.9934053-1134-201096403722454/AnsiballZ_copy.py'
Jan 26 14:47:23 compute-1 sudo[95398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:23 compute-1 python3.9[95400]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769438842.9934053-1134-201096403722454/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:23 compute-1 sudo[95398]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:23 compute-1 sudo[95474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mivbmyfwqkbalyqgfhgjflqwuiijobty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438842.9934053-1134-201096403722454/AnsiballZ_systemd.py'
Jan 26 14:47:23 compute-1 sudo[95474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:24 compute-1 python3.9[95476]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:47:24 compute-1 systemd[1]: Reloading.
Jan 26 14:47:24 compute-1 systemd-sysv-generator[95507]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:47:24 compute-1 systemd-rc-local-generator[95503]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:47:24 compute-1 sudo[95474]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:24 compute-1 sudo[95585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqfhiwgebpvoajrsgczsbdfzksjyhbuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438842.9934053-1134-201096403722454/AnsiballZ_systemd.py'
Jan 26 14:47:24 compute-1 sudo[95585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:25 compute-1 python3.9[95587]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:47:25 compute-1 systemd[1]: Reloading.
Jan 26 14:47:25 compute-1 systemd-sysv-generator[95617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:47:25 compute-1 systemd-rc-local-generator[95614]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:47:25 compute-1 systemd[1]: Starting ovn_controller container...
Jan 26 14:47:25 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 26 14:47:25 compute-1 systemd[1]: Started libcrun container.
Jan 26 14:47:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78f2bfbd907a76c143fb8c74d36a54a18586c69c2560c144502d0b6ec699d5ed/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 26 14:47:25 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1.
Jan 26 14:47:25 compute-1 podman[95628]: 2026-01-26 14:47:25.551298701 +0000 UTC m=+0.154635459 container init b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 14:47:25 compute-1 ovn_controller[95641]: + sudo -E kolla_set_configs
Jan 26 14:47:25 compute-1 podman[95628]: 2026-01-26 14:47:25.574510317 +0000 UTC m=+0.177847075 container start b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 14:47:25 compute-1 edpm-start-podman-container[95628]: ovn_controller
Jan 26 14:47:25 compute-1 systemd[1]: Created slice User Slice of UID 0.
Jan 26 14:47:25 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 26 14:47:25 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 26 14:47:25 compute-1 systemd[1]: Starting User Manager for UID 0...
Jan 26 14:47:25 compute-1 systemd[95680]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 26 14:47:25 compute-1 edpm-start-podman-container[95627]: Creating additional drop-in dependency for "ovn_controller" (b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1)
Jan 26 14:47:25 compute-1 systemd[1]: Reloading.
Jan 26 14:47:25 compute-1 podman[95647]: 2026-01-26 14:47:25.696113448 +0000 UTC m=+0.107749384 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 26 14:47:25 compute-1 systemd-rc-local-generator[95730]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:47:25 compute-1 systemd-sysv-generator[95733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:47:25 compute-1 systemd[95680]: Queued start job for default target Main User Target.
Jan 26 14:47:25 compute-1 systemd[95680]: Created slice User Application Slice.
Jan 26 14:47:25 compute-1 systemd[95680]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 26 14:47:25 compute-1 systemd[95680]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 14:47:25 compute-1 systemd[95680]: Reached target Paths.
Jan 26 14:47:25 compute-1 systemd[95680]: Reached target Timers.
Jan 26 14:47:25 compute-1 systemd[95680]: Starting D-Bus User Message Bus Socket...
Jan 26 14:47:25 compute-1 systemd[95680]: Starting Create User's Volatile Files and Directories...
Jan 26 14:47:25 compute-1 systemd[95680]: Finished Create User's Volatile Files and Directories.
Jan 26 14:47:25 compute-1 systemd[95680]: Listening on D-Bus User Message Bus Socket.
Jan 26 14:47:25 compute-1 systemd[95680]: Reached target Sockets.
Jan 26 14:47:25 compute-1 systemd[95680]: Reached target Basic System.
Jan 26 14:47:25 compute-1 systemd[95680]: Reached target Main User Target.
Jan 26 14:47:25 compute-1 systemd[95680]: Startup finished in 122ms.
Jan 26 14:47:25 compute-1 sshd-session[91606]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 2270 ssh2 [preauth]
Jan 26 14:47:25 compute-1 sshd-session[91606]: Disconnecting authenticating user root 185.246.128.170 port 2270: Too many authentication failures [preauth]
Jan 26 14:47:25 compute-1 systemd[1]: Started User Manager for UID 0.
Jan 26 14:47:25 compute-1 systemd[1]: Started ovn_controller container.
Jan 26 14:47:25 compute-1 systemd[1]: b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1-3c8eb2fd5ec69c88.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 14:47:25 compute-1 systemd[1]: b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1-3c8eb2fd5ec69c88.service: Failed with result 'exit-code'.
Jan 26 14:47:25 compute-1 systemd[1]: Started Session c1 of User root.
Jan 26 14:47:25 compute-1 sudo[95585]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:25 compute-1 ovn_controller[95641]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 14:47:25 compute-1 ovn_controller[95641]: INFO:__main__:Validating config file
Jan 26 14:47:25 compute-1 ovn_controller[95641]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 14:47:25 compute-1 ovn_controller[95641]: INFO:__main__:Writing out command to execute
Jan 26 14:47:26 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 26 14:47:26 compute-1 ovn_controller[95641]: ++ cat /run_command
Jan 26 14:47:26 compute-1 ovn_controller[95641]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 14:47:26 compute-1 ovn_controller[95641]: + ARGS=
Jan 26 14:47:26 compute-1 ovn_controller[95641]: + sudo kolla_copy_cacerts
Jan 26 14:47:26 compute-1 systemd[1]: Started Session c2 of User root.
Jan 26 14:47:26 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 26 14:47:26 compute-1 ovn_controller[95641]: + [[ ! -n '' ]]
Jan 26 14:47:26 compute-1 ovn_controller[95641]: + . kolla_extend_start
Jan 26 14:47:26 compute-1 ovn_controller[95641]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 14:47:26 compute-1 ovn_controller[95641]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 26 14:47:26 compute-1 ovn_controller[95641]: + umask 0022
Jan 26 14:47:26 compute-1 ovn_controller[95641]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Jan 26 14:47:26 compute-1 ovn_controller[95641]: 2026-01-26T14:47:26Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Jan 26 14:47:26 compute-1 NetworkManager[55716]: <info>  [1769438846.0861] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 26 14:47:26 compute-1 NetworkManager[55716]: <info>  [1769438846.0866] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 14:47:26 compute-1 NetworkManager[55716]: <warn>  [1769438846.0868] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 14:47:26 compute-1 NetworkManager[55716]: <info>  [1769438846.0873] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 26 14:47:26 compute-1 NetworkManager[55716]: <info>  [1769438846.0877] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 26 14:47:26 compute-1 NetworkManager[55716]: <info>  [1769438846.0879] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 14:47:26 compute-1 kernel: br-int: entered promiscuous mode
Jan 26 14:47:26 compute-1 systemd-udevd[95779]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 14:47:26 compute-1 python3.9[95907]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00025|main|INFO|OVS feature set changed, force recompute.
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00034|features|INFO|OVS Feature: group_support, state: supported
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00035|main|INFO|OVS feature set changed, force recompute.
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 26 14:47:27 compute-1 ovn_controller[95641]: 2026-01-26T14:47:27Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 26 14:47:27 compute-1 NetworkManager[55716]: <info>  [1769438847.6718] manager: (ovn-3e0272-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 26 14:47:27 compute-1 NetworkManager[55716]: <info>  [1769438847.6729] manager: (ovn-092f06-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Jan 26 14:47:27 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Jan 26 14:47:27 compute-1 NetworkManager[55716]: <info>  [1769438847.7090] device (genev_sys_6081): carrier: link connected
Jan 26 14:47:27 compute-1 NetworkManager[55716]: <info>  [1769438847.7095] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Jan 26 14:47:27 compute-1 sudo[96060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywtnnzrdgxrqdgppvigyvdwehkpipxsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438847.4594247-1224-65002719999635/AnsiballZ_stat.py'
Jan 26 14:47:27 compute-1 sudo[96060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:28 compute-1 python3.9[96062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:28 compute-1 sudo[96060]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:28 compute-1 sudo[96185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwhdyjmdpfthhndqmhyphapyjcqzvwuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438847.4594247-1224-65002719999635/AnsiballZ_copy.py'
Jan 26 14:47:28 compute-1 sudo[96185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:28 compute-1 python3.9[96187]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438847.4594247-1224-65002719999635/.source.yaml _original_basename=.23w13019 follow=False checksum=43b33cb7078432cf5683cd1e94bee9516ee578aa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:47:28 compute-1 sudo[96185]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:29 compute-1 sudo[96337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctfpuysgiczspjtcdfnuuzrzpdhwvzub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438849.100352-1254-131169257577020/AnsiballZ_command.py'
Jan 26 14:47:29 compute-1 sudo[96337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:29 compute-1 python3.9[96339]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:47:29 compute-1 ovs-vsctl[96340]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 26 14:47:29 compute-1 sudo[96337]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:30 compute-1 sudo[96490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldhcrexldwrxblgrqguirheqmxtevfuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438850.025319-1270-215178646971664/AnsiballZ_command.py'
Jan 26 14:47:30 compute-1 sudo[96490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:30 compute-1 python3.9[96492]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:47:30 compute-1 ovs-vsctl[96494]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 26 14:47:30 compute-1 sudo[96490]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:31 compute-1 sudo[96645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clqjxhstwpkvyftcyrizoksaphcumnbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438851.13904-1298-131584302812220/AnsiballZ_command.py'
Jan 26 14:47:31 compute-1 sudo[96645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:31 compute-1 python3.9[96647]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:47:31 compute-1 ovs-vsctl[96648]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 26 14:47:31 compute-1 sudo[96645]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:32 compute-1 sshd-session[85158]: Connection closed by 192.168.122.30 port 40644
Jan 26 14:47:32 compute-1 sshd-session[85155]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:47:32 compute-1 systemd[1]: session-21.scope: Deactivated successfully.
Jan 26 14:47:32 compute-1 systemd[1]: session-21.scope: Consumed 48.088s CPU time.
Jan 26 14:47:32 compute-1 systemd-logind[795]: Session 21 logged out. Waiting for processes to exit.
Jan 26 14:47:32 compute-1 systemd-logind[795]: Removed session 21.
Jan 26 14:47:36 compute-1 systemd[1]: Stopping User Manager for UID 0...
Jan 26 14:47:36 compute-1 systemd[95680]: Activating special unit Exit the Session...
Jan 26 14:47:36 compute-1 systemd[95680]: Stopped target Main User Target.
Jan 26 14:47:36 compute-1 systemd[95680]: Stopped target Basic System.
Jan 26 14:47:36 compute-1 systemd[95680]: Stopped target Paths.
Jan 26 14:47:36 compute-1 systemd[95680]: Stopped target Sockets.
Jan 26 14:47:36 compute-1 systemd[95680]: Stopped target Timers.
Jan 26 14:47:36 compute-1 systemd[95680]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 14:47:36 compute-1 systemd[95680]: Closed D-Bus User Message Bus Socket.
Jan 26 14:47:36 compute-1 systemd[95680]: Stopped Create User's Volatile Files and Directories.
Jan 26 14:47:36 compute-1 systemd[95680]: Removed slice User Application Slice.
Jan 26 14:47:36 compute-1 systemd[95680]: Reached target Shutdown.
Jan 26 14:47:36 compute-1 systemd[95680]: Finished Exit the Session.
Jan 26 14:47:36 compute-1 systemd[95680]: Reached target Exit the Session.
Jan 26 14:47:36 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Jan 26 14:47:36 compute-1 systemd[1]: Stopped User Manager for UID 0.
Jan 26 14:47:36 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 26 14:47:36 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 26 14:47:36 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 26 14:47:36 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 26 14:47:36 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Jan 26 14:47:38 compute-1 sshd-session[96674]: Accepted publickey for zuul from 192.168.122.30 port 56860 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:47:38 compute-1 systemd-logind[795]: New session 23 of user zuul.
Jan 26 14:47:38 compute-1 systemd[1]: Started Session 23 of User zuul.
Jan 26 14:47:38 compute-1 sshd-session[96674]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:47:39 compute-1 python3.9[96827]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:47:40 compute-1 sudo[96981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owxqfzxzloxilppipgfzvonykkxtaymx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438859.8643425-44-56186933538927/AnsiballZ_file.py'
Jan 26 14:47:40 compute-1 sudo[96981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:40 compute-1 python3.9[96983]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:40 compute-1 sudo[96981]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:41 compute-1 sudo[97133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypjsvfnonpddfdfltdpkaidowwjmsxrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438860.702302-44-97363410783797/AnsiballZ_file.py'
Jan 26 14:47:41 compute-1 sudo[97133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:41 compute-1 python3.9[97135]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:41 compute-1 sudo[97133]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:41 compute-1 sudo[97285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjndacdxbvjhjrzuovaatvyxefmaplmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438861.472088-44-58880120259557/AnsiballZ_file.py'
Jan 26 14:47:41 compute-1 sudo[97285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:42 compute-1 python3.9[97287]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:42 compute-1 sudo[97285]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:42 compute-1 sudo[97437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efseaaxfuxauqvtkpwmryhzxlccmwqzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438862.237647-44-7298297268492/AnsiballZ_file.py'
Jan 26 14:47:42 compute-1 sudo[97437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:42 compute-1 python3.9[97439]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:42 compute-1 sudo[97437]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:43 compute-1 sshd-session[96063]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 48854 ssh2 [preauth]
Jan 26 14:47:43 compute-1 sshd-session[96063]: Disconnecting authenticating user root 185.246.128.170 port 48854: Too many authentication failures [preauth]
Jan 26 14:47:43 compute-1 sudo[97589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdrekeoxfcbkinwhudsxxwtdewqwkqeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438862.9086342-44-134910611362655/AnsiballZ_file.py'
Jan 26 14:47:43 compute-1 sudo[97589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:43 compute-1 python3.9[97591]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:43 compute-1 sudo[97589]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:44 compute-1 python3.9[97741]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:47:44 compute-1 sudo[97892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppneurntqzzmgfmsyfvadctngigdbeyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438864.4431968-132-49310405623667/AnsiballZ_seboolean.py'
Jan 26 14:47:44 compute-1 sudo[97892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:45 compute-1 python3.9[97894]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 14:47:45 compute-1 sudo[97892]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:46 compute-1 python3.9[98046]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:47 compute-1 python3.9[98167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438865.9159937-148-149186548708064/.source follow=False _original_basename=haproxy.j2 checksum=a26cf614e6cc6b26d29977e50419effca5f0a51f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:47 compute-1 python3.9[98317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:48 compute-1 python3.9[98438]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438867.4070559-178-266178527783131/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:49 compute-1 sudo[98588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvtabhivdxskszrufrdokmuenklgjggd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438868.7981126-212-103024521197115/AnsiballZ_setup.py'
Jan 26 14:47:49 compute-1 sudo[98588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:49 compute-1 python3.9[98590]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:47:49 compute-1 sudo[98588]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:50 compute-1 sudo[98672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkxmytldyfkdfhhdplxdqfmjjroblrky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438868.7981126-212-103024521197115/AnsiballZ_dnf.py'
Jan 26 14:47:50 compute-1 sudo[98672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:50 compute-1 python3.9[98674]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:47:51 compute-1 sudo[98672]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:52 compute-1 sudo[98825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksiypoyurqpcdkouwelfrwxxbywxwnmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438871.8223722-236-126929650260838/AnsiballZ_systemd.py'
Jan 26 14:47:52 compute-1 sudo[98825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:47:52 compute-1 python3.9[98827]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 14:47:53 compute-1 sudo[98825]: pam_unix(sudo:session): session closed for user root
Jan 26 14:47:53 compute-1 python3.9[98980]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:54 compute-1 python3.9[99101]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438873.2321882-252-250661071416100/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:54 compute-1 python3.9[99251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:55 compute-1 python3.9[99372]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438874.4632661-252-34758580665265/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:56 compute-1 ovn_controller[95641]: 2026-01-26T14:47:56Z|00038|memory|INFO|15948 kB peak resident set size after 30.5 seconds
Jan 26 14:47:56 compute-1 ovn_controller[95641]: 2026-01-26T14:47:56Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 26 14:47:56 compute-1 podman[99496]: 2026-01-26 14:47:56.601858984 +0000 UTC m=+0.093878195 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 14:47:56 compute-1 python3.9[99535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:57 compute-1 python3.9[99669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438876.2643812-340-53254786937358/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:58 compute-1 python3.9[99819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:47:58 compute-1 python3.9[99940]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438877.4879403-340-53950446849543/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:47:59 compute-1 python3.9[100090]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:47:59 compute-1 sudo[100242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkiihankrldqnzkftoydmkcpfthsktow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438879.7118099-416-232204212711056/AnsiballZ_file.py'
Jan 26 14:47:59 compute-1 sudo[100242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:00 compute-1 python3.9[100244]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:48:00 compute-1 sudo[100242]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:00 compute-1 sudo[100394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxbnxpuzjezoxfbybthwzqrawkdmyutw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438880.4970357-432-10805968484262/AnsiballZ_stat.py'
Jan 26 14:48:00 compute-1 sudo[100394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:00 compute-1 python3.9[100396]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:48:01 compute-1 sudo[100394]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:01 compute-1 sshd-session[97742]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 1261 ssh2 [preauth]
Jan 26 14:48:01 compute-1 sshd-session[97742]: Disconnecting authenticating user root 185.246.128.170 port 1261: Too many authentication failures [preauth]
Jan 26 14:48:01 compute-1 sudo[100472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcsavxdmfkjgfswvdijeoahydmtucsdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438880.4970357-432-10805968484262/AnsiballZ_file.py'
Jan 26 14:48:01 compute-1 sudo[100472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:01 compute-1 python3.9[100474]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:48:01 compute-1 sudo[100472]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:01 compute-1 sudo[100624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pffbkkqzjcczgzipaczjwvaokztmprwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438881.6024647-432-260448467111873/AnsiballZ_stat.py'
Jan 26 14:48:01 compute-1 sudo[100624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:02 compute-1 python3.9[100626]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:48:02 compute-1 sudo[100624]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:02 compute-1 sudo[100702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjwlattiioqtczphcyzuyyhnkdsakkqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438881.6024647-432-260448467111873/AnsiballZ_file.py'
Jan 26 14:48:02 compute-1 sudo[100702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:02 compute-1 python3.9[100704]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:48:02 compute-1 sudo[100702]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:03 compute-1 sudo[100854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shhiiwqdmjfbahywbvjwljfxxndifvbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438882.8531907-478-57736258067471/AnsiballZ_file.py'
Jan 26 14:48:03 compute-1 sudo[100854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:03 compute-1 python3.9[100856]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:03 compute-1 sudo[100854]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:03 compute-1 sudo[101006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epumtrygmbvjanunnqrpvjqzpazxqdew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438883.5504882-494-144225327623551/AnsiballZ_stat.py'
Jan 26 14:48:03 compute-1 sudo[101006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:04 compute-1 python3.9[101008]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:48:04 compute-1 sudo[101006]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:04 compute-1 sudo[101084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajfykwjpsaapyzsppvcsescfztqddnmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438883.5504882-494-144225327623551/AnsiballZ_file.py'
Jan 26 14:48:04 compute-1 sudo[101084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:04 compute-1 python3.9[101086]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:04 compute-1 sudo[101084]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:04 compute-1 sudo[101236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxkgschmecpsomthosfitooxwuozwbup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438884.7133882-518-40877262761461/AnsiballZ_stat.py'
Jan 26 14:48:04 compute-1 sudo[101236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:05 compute-1 python3.9[101238]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:48:05 compute-1 sudo[101236]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:05 compute-1 sudo[101314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ergqggyoncyolizqmrjfgsmszxyrkyjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438884.7133882-518-40877262761461/AnsiballZ_file.py'
Jan 26 14:48:05 compute-1 sudo[101314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:05 compute-1 python3.9[101316]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:05 compute-1 sudo[101314]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:06 compute-1 sudo[101466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgzakalhaxvfxgpxxxlxyvyrohxchwsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438885.9693563-542-65151582240623/AnsiballZ_systemd.py'
Jan 26 14:48:06 compute-1 sudo[101466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:06 compute-1 python3.9[101468]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:06 compute-1 systemd[1]: Reloading.
Jan 26 14:48:06 compute-1 systemd-rc-local-generator[101493]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:48:06 compute-1 systemd-sysv-generator[101498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:48:06 compute-1 sudo[101466]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:07 compute-1 sudo[101656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucgzgjolabkdyuwsugxupyjrryiajeeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438887.1071668-558-233880441246004/AnsiballZ_stat.py'
Jan 26 14:48:07 compute-1 sudo[101656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:07 compute-1 python3.9[101658]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:48:07 compute-1 sudo[101656]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:07 compute-1 sudo[101735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tidfzzcldvkkjjknkgusoddueuocfoss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438887.1071668-558-233880441246004/AnsiballZ_file.py'
Jan 26 14:48:07 compute-1 sudo[101735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:08 compute-1 python3.9[101737]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:08 compute-1 sudo[101735]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:08 compute-1 sudo[101887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffxtlxnvplytboakknadbgjijibuekmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438888.3136675-582-197239886659215/AnsiballZ_stat.py'
Jan 26 14:48:08 compute-1 sudo[101887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:08 compute-1 python3.9[101889]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:48:08 compute-1 sudo[101887]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:09 compute-1 sudo[101965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqlfzxfdtvktwwknzlafscgzsamjqhbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438888.3136675-582-197239886659215/AnsiballZ_file.py'
Jan 26 14:48:09 compute-1 sudo[101965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:10 compute-1 python3.9[101967]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:10 compute-1 sudo[101965]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:10 compute-1 sudo[102117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhmarzsxbzdnmifhvovikedshzpsrtyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438890.3998673-606-246828189466786/AnsiballZ_systemd.py'
Jan 26 14:48:10 compute-1 sudo[102117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:11 compute-1 python3.9[102119]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:11 compute-1 systemd[1]: Reloading.
Jan 26 14:48:11 compute-1 systemd-rc-local-generator[102142]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:48:11 compute-1 systemd-sysv-generator[102145]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:48:11 compute-1 systemd[1]: Starting Create netns directory...
Jan 26 14:48:11 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 14:48:11 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 14:48:11 compute-1 systemd[1]: Finished Create netns directory.
Jan 26 14:48:11 compute-1 sudo[102117]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:11 compute-1 sudo[102311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwjulwmbiozvsbxypaozgwkkigicrfbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438891.6818352-626-22567213843490/AnsiballZ_file.py'
Jan 26 14:48:11 compute-1 sudo[102311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:12 compute-1 python3.9[102313]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:48:12 compute-1 sudo[102311]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:12 compute-1 sudo[102465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbfbpyamqgvgfzusfcibxxyplakdvcuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438892.3949027-642-17202899357348/AnsiballZ_stat.py'
Jan 26 14:48:12 compute-1 sudo[102465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:12 compute-1 sshd-session[102437]: Connection closed by 52.90.163.28 port 60182 [preauth]
Jan 26 14:48:12 compute-1 python3.9[102467]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:48:12 compute-1 sudo[102465]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:13 compute-1 sudo[102588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qftjdjrlsfgdmggwtgpyyrzzjhofbtab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438892.3949027-642-17202899357348/AnsiballZ_copy.py'
Jan 26 14:48:13 compute-1 sudo[102588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:13 compute-1 python3.9[102590]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769438892.3949027-642-17202899357348/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:48:13 compute-1 sudo[102588]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:14 compute-1 sudo[102740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdhqofmdejdokrioongdodukkukxzbvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438893.8209343-676-23842951557104/AnsiballZ_file.py'
Jan 26 14:48:14 compute-1 sudo[102740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:14 compute-1 python3.9[102742]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:14 compute-1 sudo[102740]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:14 compute-1 sudo[102892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-habnccfaktexmjtppjilnknqrifxjeaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438894.5574858-692-107262954322774/AnsiballZ_file.py'
Jan 26 14:48:14 compute-1 sudo[102892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:15 compute-1 python3.9[102894]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:48:15 compute-1 sudo[102892]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:15 compute-1 sudo[103044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvvluiweavzdzothjsgwymsrvzwapjqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438895.3234177-708-98774262714978/AnsiballZ_stat.py'
Jan 26 14:48:15 compute-1 sudo[103044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:15 compute-1 python3.9[103046]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:48:15 compute-1 sudo[103044]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:16 compute-1 sudo[103167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovbkbnarklnsrkvydtrfkdcefgoehaey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438895.3234177-708-98774262714978/AnsiballZ_copy.py'
Jan 26 14:48:16 compute-1 sudo[103167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:16 compute-1 python3.9[103169]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438895.3234177-708-98774262714978/.source.json _original_basename=.52vhyxm5 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:16 compute-1 sudo[103167]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:17 compute-1 python3.9[103319]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:19 compute-1 sudo[103740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iecizeavldoayhuluekfpafjexhetlpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438898.9482973-789-235445804758704/AnsiballZ_container_config_data.py'
Jan 26 14:48:19 compute-1 sudo[103740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:19 compute-1 python3.9[103742]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 26 14:48:19 compute-1 sudo[103740]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:20 compute-1 sudo[103892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjgkhtcsruyhntgkxuumjeqxxnlwelhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438899.94164-810-228970537893471/AnsiballZ_container_config_hash.py'
Jan 26 14:48:20 compute-1 sudo[103892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:20 compute-1 python3.9[103894]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 14:48:20 compute-1 sudo[103892]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:21 compute-1 sudo[104044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwxvkgxfqbqyojgpucglabojvksbfjhy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769438901.0316677-830-90998330751094/AnsiballZ_edpm_container_manage.py'
Jan 26 14:48:21 compute-1 sudo[104044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:21 compute-1 python3[104046]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 14:48:22 compute-1 podman[104084]: 2026-01-26 14:48:21.978270634 +0000 UTC m=+0.024026056 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 14:48:22 compute-1 podman[104084]: 2026-01-26 14:48:22.145726569 +0000 UTC m=+0.191482011 container create bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 14:48:22 compute-1 python3[104046]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 14:48:22 compute-1 sudo[104044]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:22 compute-1 sudo[104272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdsxaiwezeoktmtmubrmxwlxwxcebwoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438902.505582-846-100334130715799/AnsiballZ_stat.py'
Jan 26 14:48:22 compute-1 sudo[104272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:23 compute-1 python3.9[104274]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:48:23 compute-1 sudo[104272]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:23 compute-1 sudo[104426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiycquvkymhyoooiphdcefizyanthjnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438903.305828-864-118342175294912/AnsiballZ_file.py'
Jan 26 14:48:23 compute-1 sudo[104426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:23 compute-1 python3.9[104428]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:23 compute-1 sudo[104426]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:24 compute-1 sudo[104502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpknqiqdcoefkalclfxnalszluwhbswp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438903.305828-864-118342175294912/AnsiballZ_stat.py'
Jan 26 14:48:24 compute-1 sudo[104502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:24 compute-1 python3.9[104504]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:48:24 compute-1 sudo[104502]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:24 compute-1 sudo[104653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgaucjfrfjlvqjdmbuezhnpaydqatocs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438904.2931812-864-274425681446739/AnsiballZ_copy.py'
Jan 26 14:48:24 compute-1 sudo[104653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:24 compute-1 python3.9[104655]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769438904.2931812-864-274425681446739/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:24 compute-1 sudo[104653]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:25 compute-1 sudo[104729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwixusyrkbzxxuswwrjhqttymwlpyfcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438904.2931812-864-274425681446739/AnsiballZ_systemd.py'
Jan 26 14:48:25 compute-1 sudo[104729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:25 compute-1 python3.9[104731]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:48:25 compute-1 systemd[1]: Reloading.
Jan 26 14:48:25 compute-1 systemd-sysv-generator[104761]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:48:25 compute-1 systemd-rc-local-generator[104757]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:48:25 compute-1 sudo[104729]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:26 compute-1 sudo[104840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqqzbtaetoouuthcpceirbgqflzvbmyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438904.2931812-864-274425681446739/AnsiballZ_systemd.py'
Jan 26 14:48:26 compute-1 sudo[104840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:26 compute-1 python3.9[104842]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:26 compute-1 systemd[1]: Reloading.
Jan 26 14:48:26 compute-1 systemd-sysv-generator[104895]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:48:26 compute-1 systemd-rc-local-generator[104890]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:48:26 compute-1 podman[104846]: 2026-01-26 14:48:26.731285587 +0000 UTC m=+0.092698934 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 14:48:26 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Jan 26 14:48:27 compute-1 systemd[1]: Started libcrun container.
Jan 26 14:48:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e96476ee06386197d5c866e755e251863abbe10a80603cf59d20ed3bcd9461f/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 26 14:48:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e96476ee06386197d5c866e755e251863abbe10a80603cf59d20ed3bcd9461f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 14:48:27 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20.
Jan 26 14:48:27 compute-1 podman[104909]: 2026-01-26 14:48:27.049491302 +0000 UTC m=+0.131778439 container init bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: + sudo -E kolla_set_configs
Jan 26 14:48:27 compute-1 podman[104909]: 2026-01-26 14:48:27.075673329 +0000 UTC m=+0.157960456 container start bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 14:48:27 compute-1 edpm-start-podman-container[104909]: ovn_metadata_agent
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Validating config file
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Copying service configuration files
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Writing out command to execute
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 26 14:48:27 compute-1 edpm-start-podman-container[104908]: Creating additional drop-in dependency for "ovn_metadata_agent" (bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20)
Jan 26 14:48:27 compute-1 podman[104932]: 2026-01-26 14:48:27.146270678 +0000 UTC m=+0.051086253 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: ++ cat /run_command
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: + CMD=neutron-ovn-metadata-agent
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: + ARGS=
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: + sudo kolla_copy_cacerts
Jan 26 14:48:27 compute-1 systemd[1]: Reloading.
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: Running command: 'neutron-ovn-metadata-agent'
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: + [[ ! -n '' ]]
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: + . kolla_extend_start
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: + umask 0022
Jan 26 14:48:27 compute-1 ovn_metadata_agent[104924]: + exec neutron-ovn-metadata-agent
Jan 26 14:48:27 compute-1 systemd-rc-local-generator[105001]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:48:27 compute-1 systemd-sysv-generator[105005]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:48:27 compute-1 systemd[1]: Started ovn_metadata_agent container.
Jan 26 14:48:27 compute-1 sudo[104840]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:28 compute-1 python3.9[105158]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.954 104930 INFO neutron.common.config [-] Logging enabled!
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.954 104930 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.954 104930 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.955 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.955 104930 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.955 104930 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.955 104930 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.955 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.955 104930 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.955 104930 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.955 104930 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.955 104930 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.956 104930 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.957 104930 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.958 104930 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.959 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.217 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.960 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.961 104930 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.962 104930 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.963 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.964 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.965 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.966 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.967 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.968 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.969 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.970 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.970 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.970 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.970 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.970 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.970 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.970 104930 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.970 104930 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.970 104930 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.971 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.972 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.973 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.974 104930 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.975 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.976 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.977 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.977 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.977 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.977 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.977 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.977 104930 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.977 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.977 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.977 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.978 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.979 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.980 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.981 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.982 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.982 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.982 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.982 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.982 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.982 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.982 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.982 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.982 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.983 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.984 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.985 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.986 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.987 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.987 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.987 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.987 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.987 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.987 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.987 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.987 104930 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.987 104930 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.996 104930 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.997 104930 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.997 104930 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.997 104930 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 26 14:48:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:28.997 104930 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.009 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 41380b5a-e321-4ce4-bcc6-ecd563b3c793 (UUID: 41380b5a-e321-4ce4-bcc6-ecd563b3c793) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.038 104930 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.038 104930 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.038 104930 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.038 104930 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.038 104930 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.042 104930 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.051 104930 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.058 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '41380b5a-e321-4ce4-bcc6-ecd563b3c793'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], external_ids={}, name=41380b5a-e321-4ce4-bcc6-ecd563b3c793, nb_cfg_timestamp=1769438855109, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 14:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.061 104930 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpo1_hw4rk/privsep.sock']
Jan 26 14:48:29 compute-1 sudo[105314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfcratwinqbslhrlenpjebkmsnhnsoym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438908.7512422-954-281262421910842/AnsiballZ_stat.py'
Jan 26 14:48:29 compute-1 sudo[105314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:29 compute-1 python3.9[105320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:48:29 compute-1 sudo[105314]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:29 compute-1 sudo[105445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmwzesqtkzycyfihpayuihrmeorscfkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438908.7512422-954-281262421910842/AnsiballZ_copy.py'
Jan 26 14:48:29 compute-1 sudo[105445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:29 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 26 14:48:30 compute-1 sshd-session[101506]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 5115 ssh2 [preauth]
Jan 26 14:48:30 compute-1 sshd-session[101506]: Disconnecting authenticating user root 185.246.128.170 port 5115: Too many authentication failures [preauth]
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:30.182 104930 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:30.182 104930 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpo1_hw4rk/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.659 105448 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.663 105448 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.665 105448 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:29.665 105448 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105448
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:30.185 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[73476292-4c50-436a-82d0-14605c519ab7]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 14:48:30 compute-1 python3.9[105447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769438908.7512422-954-281262421910842/.source.yaml _original_basename=.r53pjlrs follow=False checksum=99a252e4b6d901576958376aa7d2a3b94805214d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:30 compute-1 sudo[105445]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:30 compute-1 sshd-session[96677]: Connection closed by 192.168.122.30 port 56860
Jan 26 14:48:30 compute-1 sshd-session[96674]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:48:30 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Jan 26 14:48:30 compute-1 systemd[1]: session-23.scope: Consumed 37.791s CPU time.
Jan 26 14:48:30 compute-1 systemd-logind[795]: Session 23 logged out. Waiting for processes to exit.
Jan 26 14:48:30 compute-1 systemd-logind[795]: Removed session 23.
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:30.769 105448 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:30.770 105448 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:48:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:30.770 105448 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:48:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:31.347 105448 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 26 14:48:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:31.434 105448 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 26 14:48:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:31.482 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[187b4c9b-b372-42f8-ae8c-08c7a95334cf]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 14:48:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:31.483 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, column=external_ids, values=({'neutron:ovn-metadata-id': '0e3de3e4-2b0e-54fa-bd6a-3ae04ad994ec'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 14:48:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:31.490 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 14:48:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:48:31.496 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 14:48:36 compute-1 sshd-session[105480]: Accepted publickey for zuul from 192.168.122.30 port 42468 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:48:36 compute-1 systemd-logind[795]: New session 24 of user zuul.
Jan 26 14:48:36 compute-1 systemd[1]: Started Session 24 of User zuul.
Jan 26 14:48:36 compute-1 sshd-session[105480]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:48:37 compute-1 python3.9[105633]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:48:38 compute-1 sudo[105788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vouwxyhvfcubjrsoifwjipysthyalmsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438918.0854464-44-56633052721733/AnsiballZ_command.py'
Jan 26 14:48:38 compute-1 sudo[105788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:38 compute-1 python3.9[105790]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:48:38 compute-1 sudo[105788]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:39 compute-1 sudo[105953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wakzuapppalkzoxngxbqxhfsynmycjhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438919.2712083-66-276003356262383/AnsiballZ_systemd_service.py'
Jan 26 14:48:39 compute-1 sudo[105953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:40 compute-1 python3.9[105955]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:48:40 compute-1 systemd[1]: Reloading.
Jan 26 14:48:40 compute-1 systemd-rc-local-generator[105977]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:48:40 compute-1 systemd-sysv-generator[105982]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:48:40 compute-1 sudo[105953]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:41 compute-1 python3.9[106139]: ansible-ansible.builtin.service_facts Invoked
Jan 26 14:48:41 compute-1 network[106156]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 14:48:41 compute-1 network[106157]: 'network-scripts' will be removed from distribution in near future.
Jan 26 14:48:41 compute-1 network[106158]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 14:48:45 compute-1 sudo[106417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efqmprphclfrridafumwjjxjjqrzhnqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438925.5006075-104-50668158891454/AnsiballZ_systemd_service.py'
Jan 26 14:48:45 compute-1 sudo[106417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:46 compute-1 python3.9[106419]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:46 compute-1 sudo[106417]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:46 compute-1 sudo[106570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icbnfvyheuhsydkqxndzlwgyyrwmsrhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438926.3162172-104-188141262473285/AnsiballZ_systemd_service.py'
Jan 26 14:48:46 compute-1 sudo[106570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:46 compute-1 python3.9[106572]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:46 compute-1 sudo[106570]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:47 compute-1 sudo[106723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gctsmtrggibowzrtrqeixsvivchcvgjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438927.126003-104-26204626935583/AnsiballZ_systemd_service.py'
Jan 26 14:48:47 compute-1 sudo[106723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:48 compute-1 python3.9[106725]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:48 compute-1 sudo[106723]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:48 compute-1 sudo[106876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvssriukucnnjsyypkqzhfznfgtovgdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438928.2628982-104-83614991908718/AnsiballZ_systemd_service.py'
Jan 26 14:48:48 compute-1 sudo[106876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:48 compute-1 python3.9[106878]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:48 compute-1 sudo[106876]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:49 compute-1 sudo[107029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thksznokqsjdcmcbdccyyvdpakcojlfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438929.1091492-104-266671493651312/AnsiballZ_systemd_service.py'
Jan 26 14:48:49 compute-1 sudo[107029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:49 compute-1 python3.9[107031]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:49 compute-1 sudo[107029]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:50 compute-1 sudo[107182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mubttqqgnbbpomyadtikrjfioppbyqal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438929.9463985-104-160359548202791/AnsiballZ_systemd_service.py'
Jan 26 14:48:50 compute-1 sudo[107182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:50 compute-1 python3.9[107184]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:50 compute-1 sudo[107182]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:51 compute-1 sshd-session[105479]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 28917 ssh2 [preauth]
Jan 26 14:48:51 compute-1 sshd-session[105479]: Disconnecting authenticating user root 185.246.128.170 port 28917: Too many authentication failures [preauth]
Jan 26 14:48:51 compute-1 sudo[107335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyesbbbovnkvtkixekdqekstxeqbvibc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438930.8043184-104-273796279881556/AnsiballZ_systemd_service.py'
Jan 26 14:48:51 compute-1 sudo[107335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:51 compute-1 python3.9[107337]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:48:51 compute-1 sudo[107335]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:52 compute-1 sudo[107488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybctltadswmiykocfbjicvvijttwuald ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438932.0465662-208-161758634092194/AnsiballZ_file.py'
Jan 26 14:48:52 compute-1 sudo[107488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:52 compute-1 python3.9[107490]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:52 compute-1 sudo[107488]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:53 compute-1 sudo[107641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alokrsxfacttpbmzoazjiksgpqkdchuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438933.0077977-208-7448907740729/AnsiballZ_file.py'
Jan 26 14:48:53 compute-1 sudo[107641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:53 compute-1 python3.9[107643]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:53 compute-1 sudo[107641]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:53 compute-1 sudo[107794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tknqpgephrzfwrkoompqhifngumtvioz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438933.682901-208-33353510768155/AnsiballZ_file.py'
Jan 26 14:48:53 compute-1 sudo[107794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:54 compute-1 python3.9[107796]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:54 compute-1 sudo[107794]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:54 compute-1 sudo[107946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aprnjglauryspehhbewxypcjtutrpbnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438934.3009524-208-135965554588605/AnsiballZ_file.py'
Jan 26 14:48:54 compute-1 sudo[107946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:54 compute-1 python3.9[107948]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:54 compute-1 sudo[107946]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:55 compute-1 sudo[108098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euwqzygtwdnmcvehqbsosbvzldmgyjgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438934.9415436-208-273365803805781/AnsiballZ_file.py'
Jan 26 14:48:55 compute-1 sudo[108098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:55 compute-1 python3.9[108100]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:55 compute-1 sudo[108098]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:55 compute-1 sudo[108250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiovzkqgpcoczjynperxlqipcvdspgyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438935.511289-208-121451995276195/AnsiballZ_file.py'
Jan 26 14:48:55 compute-1 sudo[108250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:55 compute-1 python3.9[108252]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:55 compute-1 sudo[108250]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:56 compute-1 sudo[108402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxjaubwlzontrbpghmuyfacseuzmhwff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438936.0938737-208-265893033235013/AnsiballZ_file.py'
Jan 26 14:48:56 compute-1 sudo[108402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:56 compute-1 python3.9[108404]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:56 compute-1 sudo[108402]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:57 compute-1 sudo[108563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usmygytmwzsbxdnloeuethmygrggxtzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438936.732599-308-61609816068816/AnsiballZ_file.py'
Jan 26 14:48:57 compute-1 sudo[108563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:57 compute-1 podman[108528]: 2026-01-26 14:48:57.074910444 +0000 UTC m=+0.107349233 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team)
Jan 26 14:48:57 compute-1 python3.9[108570]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:57 compute-1 sudo[108563]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:57 compute-1 sshd-session[107491]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 21863 ssh2 [preauth]
Jan 26 14:48:57 compute-1 sshd-session[107491]: Disconnecting authenticating user root 185.246.128.170 port 21863: Too many authentication failures [preauth]
Jan 26 14:48:57 compute-1 sudo[108744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffrseixozfdfpojbsytilwmezyldvtfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438937.3308623-308-82902461005122/AnsiballZ_file.py'
Jan 26 14:48:57 compute-1 sudo[108744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:57 compute-1 podman[108705]: 2026-01-26 14:48:57.652487367 +0000 UTC m=+0.069184419 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 14:48:57 compute-1 python3.9[108752]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:57 compute-1 sudo[108744]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:58 compute-1 sudo[108902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwfghbayafjmeoomyvcvdtsspvzhvymg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438937.9848282-308-184851984093736/AnsiballZ_file.py'
Jan 26 14:48:58 compute-1 sudo[108902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:58 compute-1 python3.9[108904]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:58 compute-1 sudo[108902]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:58 compute-1 sudo[109054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sidclpvpgzxphxrllcckrplzynenpdue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438938.5723052-308-56608138346325/AnsiballZ_file.py'
Jan 26 14:48:58 compute-1 sudo[109054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:59 compute-1 python3.9[109056]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:59 compute-1 sudo[109054]: pam_unix(sudo:session): session closed for user root
Jan 26 14:48:59 compute-1 sudo[109206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwewjquwwpyqqxnozhwxkptkcwbfaqgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438939.2562778-308-264873312917133/AnsiballZ_file.py'
Jan 26 14:48:59 compute-1 sudo[109206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:48:59 compute-1 python3.9[109208]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:48:59 compute-1 sudo[109206]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:00 compute-1 sudo[109359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkzrtxtageqxiutwgrjgqvxrasohdzki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438939.9192798-308-13482620071910/AnsiballZ_file.py'
Jan 26 14:49:00 compute-1 sudo[109359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:00 compute-1 python3.9[109361]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:49:00 compute-1 sudo[109359]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:00 compute-1 sudo[109512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwdoqgqkarhjgboifcgrbzwythjvnxwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438940.5419774-308-230662157850240/AnsiballZ_file.py'
Jan 26 14:49:00 compute-1 sudo[109512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:00 compute-1 python3.9[109514]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:49:01 compute-1 sudo[109512]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:01 compute-1 sudo[109664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diluwrxovgidhhmvuofqrdrfbscligic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438941.2844472-410-11898486332469/AnsiballZ_command.py'
Jan 26 14:49:01 compute-1 sudo[109664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:01 compute-1 python3.9[109666]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:49:01 compute-1 sudo[109664]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:02 compute-1 python3.9[109818]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 14:49:03 compute-1 sudo[109968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcnnsmpnslymwihssfoypyvmbcfjcxbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438942.7636383-446-55399558025253/AnsiballZ_systemd_service.py'
Jan 26 14:49:03 compute-1 sudo[109968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:03 compute-1 python3.9[109970]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:49:03 compute-1 systemd[1]: Reloading.
Jan 26 14:49:03 compute-1 systemd-rc-local-generator[109997]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:49:03 compute-1 systemd-sysv-generator[110001]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:49:03 compute-1 sudo[109968]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:04 compute-1 sudo[110155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfrcypmiraszvtoljuukxakzlngtxfsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438943.9355443-462-185884548233227/AnsiballZ_command.py'
Jan 26 14:49:04 compute-1 sudo[110155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:04 compute-1 python3.9[110157]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:49:04 compute-1 sudo[110155]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:04 compute-1 sudo[110308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szfeepenfjsooqxzgxurzxinlipgrxnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438944.586976-462-121486241777427/AnsiballZ_command.py'
Jan 26 14:49:04 compute-1 sudo[110308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:05 compute-1 python3.9[110310]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:49:05 compute-1 sudo[110308]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:05 compute-1 sudo[110461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xndjeqcbethqsiepmkdgqwdjvzoxjruj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438945.3572242-462-72041601651657/AnsiballZ_command.py'
Jan 26 14:49:05 compute-1 sudo[110461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:05 compute-1 python3.9[110463]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:49:05 compute-1 sudo[110461]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:06 compute-1 sudo[110614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqcnovtutoadwdwoxyaefvwjsainvbsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438946.0573335-462-86640264429221/AnsiballZ_command.py'
Jan 26 14:49:06 compute-1 sudo[110614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:06 compute-1 python3.9[110616]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:49:06 compute-1 sudo[110614]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:06 compute-1 sudo[110767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcxrlbtqijvbnloptcruzgexstkatutp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438946.6508672-462-99348415643670/AnsiballZ_command.py'
Jan 26 14:49:06 compute-1 sudo[110767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:07 compute-1 python3.9[110769]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:49:07 compute-1 sudo[110767]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:07 compute-1 sudo[110920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jairryflpsbxtbhmvbbtiquzxgnldjes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438947.310521-462-111898706920277/AnsiballZ_command.py'
Jan 26 14:49:07 compute-1 sudo[110920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:07 compute-1 python3.9[110922]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:49:07 compute-1 sudo[110920]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:08 compute-1 sudo[111073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdlvilcaitseovdkdkbcqmrhghznlkag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438947.9706187-462-214624327591086/AnsiballZ_command.py'
Jan 26 14:49:08 compute-1 sudo[111073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:08 compute-1 python3.9[111075]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:49:08 compute-1 sudo[111073]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:09 compute-1 sudo[111226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrpauzbgdnmhdwcyvsdznqkvjqdvthlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438949.0127454-570-99927371935480/AnsiballZ_getent.py'
Jan 26 14:49:09 compute-1 sudo[111226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:09 compute-1 python3.9[111228]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 26 14:49:09 compute-1 sudo[111226]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:10 compute-1 sudo[111379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziawkwvamorvtythjvfwhuakcqssorae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438949.922257-586-62729678514942/AnsiballZ_group.py'
Jan 26 14:49:10 compute-1 sudo[111379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:10 compute-1 python3.9[111381]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 14:49:10 compute-1 groupadd[111382]: group added to /etc/group: name=libvirt, GID=42473
Jan 26 14:49:10 compute-1 groupadd[111382]: group added to /etc/gshadow: name=libvirt
Jan 26 14:49:10 compute-1 groupadd[111382]: new group: name=libvirt, GID=42473
Jan 26 14:49:10 compute-1 sudo[111379]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:11 compute-1 sudo[111537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsyisxavyvikjbxgkfjhwkdiqpflaben ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438950.922769-602-130959215959365/AnsiballZ_user.py'
Jan 26 14:49:11 compute-1 sudo[111537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:11 compute-1 python3.9[111539]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 14:49:12 compute-1 useradd[111541]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 14:49:12 compute-1 sudo[111537]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:13 compute-1 sudo[111697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdszqidthowqalvhbfnjrabtqbqihtaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438952.748894-624-36434764338282/AnsiballZ_setup.py'
Jan 26 14:49:13 compute-1 sudo[111697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:13 compute-1 python3.9[111699]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:49:13 compute-1 sudo[111697]: pam_unix(sudo:session): session closed for user root
Jan 26 14:49:14 compute-1 sudo[111781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmhamwwjsppdvdfrnsfutvxeliolnscl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769438952.748894-624-36434764338282/AnsiballZ_dnf.py'
Jan 26 14:49:14 compute-1 sudo[111781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:49:14 compute-1 python3.9[111783]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:49:17 compute-1 sshd-session[109209]: error: maximum authentication attempts exceeded for root from 185.246.128.170 port 1095 ssh2 [preauth]
Jan 26 14:49:17 compute-1 sshd-session[109209]: Disconnecting authenticating user root 185.246.128.170 port 1095: Too many authentication failures [preauth]
Jan 26 14:49:27 compute-1 sshd-session[111794]: Disconnecting authenticating user root 185.246.128.170 port 56262: Change of username or service not allowed: (root,ssh-connection) -> (minecraft,ssh-connection) [preauth]
Jan 26 14:49:27 compute-1 podman[111969]: 2026-01-26 14:49:27.89572138 +0000 UTC m=+0.072309345 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 26 14:49:27 compute-1 podman[111968]: 2026-01-26 14:49:27.953271249 +0000 UTC m=+0.129192176 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 14:49:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:49:28.988 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:49:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:49:28.989 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:49:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:49:28.989 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:49:37 compute-1 sshd-session[112013]: Invalid user minecraft from 185.246.128.170 port 20595
Jan 26 14:49:38 compute-1 sshd-session[112013]: Disconnecting invalid user minecraft 185.246.128.170 port 20595: Change of username or service not allowed: (minecraft,ssh-connection) -> (john,ssh-connection) [preauth]
Jan 26 14:49:46 compute-1 sshd-session[112021]: Invalid user john from 185.246.128.170 port 34538
Jan 26 14:49:46 compute-1 sshd-session[112021]: Disconnecting invalid user john 185.246.128.170 port 34538: Change of username or service not allowed: (john,ssh-connection) -> (onlime_r,ssh-connection) [preauth]
Jan 26 14:49:49 compute-1 kernel: SELinux:  Converting 2764 SID table entries...
Jan 26 14:49:49 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 14:49:49 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 14:49:49 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 14:49:49 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 14:49:49 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 14:49:49 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 14:49:49 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 14:49:55 compute-1 sshd-session[112023]: Invalid user onlime_r from 185.246.128.170 port 7233
Jan 26 14:49:57 compute-1 sshd-session[112023]: Disconnecting invalid user onlime_r 185.246.128.170 port 7233: Change of username or service not allowed: (onlime_r,ssh-connection) -> (weewx,ssh-connection) [preauth]
Jan 26 14:49:58 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 26 14:49:58 compute-1 podman[112039]: 2026-01-26 14:49:58.900184545 +0000 UTC m=+0.062489142 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 14:49:58 compute-1 podman[112038]: 2026-01-26 14:49:58.933507237 +0000 UTC m=+0.097575291 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Jan 26 14:49:59 compute-1 kernel: SELinux:  Converting 2764 SID table entries...
Jan 26 14:49:59 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 14:49:59 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 14:49:59 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 14:49:59 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 14:49:59 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 14:49:59 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 14:49:59 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 14:50:10 compute-1 sshd-session[112088]: Invalid user weewx from 185.246.128.170 port 59669
Jan 26 14:50:20 compute-1 sshd-session[112088]: Disconnecting invalid user weewx 185.246.128.170 port 59669: Change of username or service not allowed: (weewx,ssh-connection) -> (xiaoxiao,ssh-connection) [preauth]
Jan 26 14:50:26 compute-1 sshd-session[118847]: Invalid user xiaoxiao from 185.246.128.170 port 13104
Jan 26 14:50:27 compute-1 sshd-session[118847]: Disconnecting invalid user xiaoxiao 185.246.128.170 port 13104: Change of username or service not allowed: (xiaoxiao,ssh-connection) -> (nexus,ssh-connection) [preauth]
Jan 26 14:50:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:50:28.990 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:50:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:50:28.991 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:50:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:50:28.991 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:50:29 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 26 14:50:29 compute-1 podman[123457]: 2026-01-26 14:50:29.887790562 +0000 UTC m=+0.059510074 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 26 14:50:29 compute-1 podman[123446]: 2026-01-26 14:50:29.917382251 +0000 UTC m=+0.089607346 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 14:50:32 compute-1 sshd-session[124257]: Invalid user nexus from 185.246.128.170 port 9089
Jan 26 14:50:38 compute-1 sshd-session[128732]: Connection closed by 45.148.10.121 port 36810 [preauth]
Jan 26 14:50:41 compute-1 sshd-session[124257]: Disconnecting invalid user nexus 185.246.128.170 port 9089: Change of username or service not allowed: (nexus,ssh-connection) -> (netlink,ssh-connection) [preauth]
Jan 26 14:50:52 compute-1 kernel: SELinux:  Converting 2765 SID table entries...
Jan 26 14:50:52 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 14:50:52 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 26 14:50:52 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 14:50:52 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 26 14:50:52 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 14:50:52 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 14:50:52 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 14:50:53 compute-1 groupadd[129021]: group added to /etc/group: name=dnsmasq, GID=993
Jan 26 14:50:53 compute-1 groupadd[129021]: group added to /etc/gshadow: name=dnsmasq
Jan 26 14:50:53 compute-1 groupadd[129021]: new group: name=dnsmasq, GID=993
Jan 26 14:50:53 compute-1 useradd[129028]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 26 14:50:53 compute-1 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Jan 26 14:50:53 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 26 14:50:54 compute-1 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Jan 26 14:50:54 compute-1 sshd-session[129007]: Invalid user netlink from 185.246.128.170 port 23571
Jan 26 14:50:54 compute-1 groupadd[129041]: group added to /etc/group: name=clevis, GID=992
Jan 26 14:50:54 compute-1 groupadd[129041]: group added to /etc/gshadow: name=clevis
Jan 26 14:50:54 compute-1 groupadd[129041]: new group: name=clevis, GID=992
Jan 26 14:50:54 compute-1 useradd[129048]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 26 14:50:54 compute-1 sshd-session[129007]: Disconnecting invalid user netlink 185.246.128.170 port 23571: Change of username or service not allowed: (netlink,ssh-connection) -> (ubuntu,ssh-connection) [preauth]
Jan 26 14:50:55 compute-1 usermod[129058]: add 'clevis' to group 'tss'
Jan 26 14:50:55 compute-1 usermod[129058]: add 'clevis' to shadow group 'tss'
Jan 26 14:50:57 compute-1 polkitd[43743]: Reloading rules
Jan 26 14:50:57 compute-1 polkitd[43743]: Collecting garbage unconditionally...
Jan 26 14:50:57 compute-1 polkitd[43743]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 14:50:57 compute-1 polkitd[43743]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 14:50:57 compute-1 polkitd[43743]: Finished loading, compiling and executing 3 rules
Jan 26 14:50:57 compute-1 polkitd[43743]: Reloading rules
Jan 26 14:50:57 compute-1 polkitd[43743]: Collecting garbage unconditionally...
Jan 26 14:50:57 compute-1 polkitd[43743]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 14:50:57 compute-1 polkitd[43743]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 14:50:57 compute-1 polkitd[43743]: Finished loading, compiling and executing 3 rules
Jan 26 14:50:58 compute-1 groupadd[129248]: group added to /etc/group: name=ceph, GID=167
Jan 26 14:50:58 compute-1 groupadd[129248]: group added to /etc/gshadow: name=ceph
Jan 26 14:50:58 compute-1 groupadd[129248]: new group: name=ceph, GID=167
Jan 26 14:50:58 compute-1 useradd[129254]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 26 14:51:00 compute-1 podman[129262]: 2026-01-26 14:51:00.936702479 +0000 UTC m=+0.089166955 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 26 14:51:00 compute-1 podman[129261]: 2026-01-26 14:51:00.984307872 +0000 UTC m=+0.137450234 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 14:51:02 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Jan 26 14:51:02 compute-1 sshd[1009]: Received signal 15; terminating.
Jan 26 14:51:02 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Jan 26 14:51:02 compute-1 systemd[1]: sshd.service: Unit process 129291 (sshd-session) remains running after unit stopped.
Jan 26 14:51:02 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Jan 26 14:51:02 compute-1 systemd[1]: sshd.service: Consumed 4.644s CPU time, 13.0M memory peak, read 32.0K from disk, written 284.0K to disk.
Jan 26 14:51:02 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Jan 26 14:51:02 compute-1 systemd[1]: Stopping sshd-keygen.target...
Jan 26 14:51:02 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:51:02 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 14:51:02 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 14:51:02 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 14:51:02 compute-1 systemd[1]: Reached target sshd-keygen.target.
Jan 26 14:51:03 compute-1 systemd[1]: Starting OpenSSH server daemon...
Jan 26 14:51:03 compute-1 sshd[129819]: Server listening on 0.0.0.0 port 22.
Jan 26 14:51:03 compute-1 sshd[129819]: Server listening on :: port 22.
Jan 26 14:51:03 compute-1 systemd[1]: Started OpenSSH server daemon.
Jan 26 14:51:05 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 14:51:05 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 14:51:05 compute-1 systemd[1]: Reloading.
Jan 26 14:51:05 compute-1 systemd-sysv-generator[130083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:05 compute-1 systemd-rc-local-generator[130080]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:06 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 14:51:11 compute-1 sudo[111781]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:12 compute-1 sudo[134276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gistpckencwkrkwyypffrsitfftgydek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439071.9781268-648-45792401602911/AnsiballZ_systemd.py'
Jan 26 14:51:12 compute-1 sudo[134276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:13 compute-1 python3.9[134298]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 14:51:13 compute-1 systemd[1]: Reloading.
Jan 26 14:51:13 compute-1 systemd-rc-local-generator[134784]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:13 compute-1 systemd-sysv-generator[134787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:13 compute-1 sudo[134276]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:13 compute-1 sudo[135585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvvexcqthlxvnjstllwqlqhbibkkjrjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439073.5172029-648-233446929048978/AnsiballZ_systemd.py'
Jan 26 14:51:13 compute-1 sudo[135585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:14 compute-1 python3.9[135603]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 14:51:14 compute-1 systemd[1]: Reloading.
Jan 26 14:51:14 compute-1 systemd-rc-local-generator[136123]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:14 compute-1 systemd-sysv-generator[136129]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:14 compute-1 sudo[135585]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:14 compute-1 sudo[136622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbwgecwyigadsrqvwoeucffcqqbtvosj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439074.6689785-648-128815451066505/AnsiballZ_systemd.py'
Jan 26 14:51:14 compute-1 sudo[136622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:15 compute-1 python3.9[136641]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 14:51:15 compute-1 systemd[1]: Reloading.
Jan 26 14:51:15 compute-1 systemd-rc-local-generator[137057]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:15 compute-1 systemd-sysv-generator[137064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:15 compute-1 sshd-session[129291]: Invalid user ubuntu from 185.246.128.170 port 58686
Jan 26 14:51:15 compute-1 sudo[136622]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:16 compute-1 sudo[138102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbfajemfcrmljibscxhclpotnllsfton ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439075.9123147-648-26201906093440/AnsiballZ_systemd.py'
Jan 26 14:51:16 compute-1 sudo[138102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:16 compute-1 python3.9[138126]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 14:51:16 compute-1 systemd[1]: Reloading.
Jan 26 14:51:16 compute-1 systemd-rc-local-generator[138392]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:16 compute-1 systemd-sysv-generator[138395]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:16 compute-1 sudo[138102]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:18 compute-1 sudo[139247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptjvfvreuicfrljykxzhybioqqhqjedf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439078.2098017-706-167908268946656/AnsiballZ_systemd.py'
Jan 26 14:51:18 compute-1 sudo[139247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:18 compute-1 python3.9[139249]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:18 compute-1 systemd[1]: Reloading.
Jan 26 14:51:18 compute-1 systemd-rc-local-generator[139396]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:18 compute-1 systemd-sysv-generator[139400]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:19 compute-1 sudo[139247]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:19 compute-1 sudo[139554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bstsmxzjormyeswfovdklbntzjeoqihh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439079.266666-706-148914851766870/AnsiballZ_systemd.py'
Jan 26 14:51:19 compute-1 sudo[139554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:19 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 14:51:19 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 14:51:19 compute-1 systemd[1]: man-db-cache-update.service: Consumed 10.343s CPU time.
Jan 26 14:51:19 compute-1 systemd[1]: run-r4bd86adcbccb4d2d881762d88e1b770f.service: Deactivated successfully.
Jan 26 14:51:19 compute-1 python3.9[139556]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:19 compute-1 systemd[1]: Reloading.
Jan 26 14:51:20 compute-1 systemd-sysv-generator[139590]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:20 compute-1 systemd-rc-local-generator[139587]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:20 compute-1 sshd-session[129291]: Disconnecting invalid user ubuntu 185.246.128.170 port 58686: Change of username or service not allowed: (ubuntu,ssh-connection) -> (odoo18,ssh-connection) [preauth]
Jan 26 14:51:20 compute-1 sudo[139554]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:20 compute-1 sudo[139745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rckdoczznrllhmusmoenfnladfxegwrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439080.6734025-706-13411823812504/AnsiballZ_systemd.py'
Jan 26 14:51:20 compute-1 sudo[139745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:21 compute-1 python3.9[139747]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:21 compute-1 systemd[1]: Reloading.
Jan 26 14:51:21 compute-1 systemd-rc-local-generator[139777]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:21 compute-1 systemd-sysv-generator[139780]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:21 compute-1 sudo[139745]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:22 compute-1 sudo[139935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yilvejwuiqasnglblmpjaurqtlcpkegr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439081.8957283-706-127938220807859/AnsiballZ_systemd.py'
Jan 26 14:51:22 compute-1 sudo[139935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:22 compute-1 python3.9[139937]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:22 compute-1 sudo[139935]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:22 compute-1 sudo[140091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afpmaxvnbpuusjdbxwmmqjkftqyeymkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439082.6967382-706-140148619648566/AnsiballZ_systemd.py'
Jan 26 14:51:22 compute-1 sudo[140091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:23 compute-1 python3.9[140093]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:23 compute-1 systemd[1]: Reloading.
Jan 26 14:51:23 compute-1 systemd-sysv-generator[140128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:23 compute-1 systemd-rc-local-generator[140124]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:23 compute-1 sudo[140091]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:25 compute-1 sudo[140283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ratzmtuhbgyaezpbpkpzohbxkcirmnht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439084.7779884-778-154362763724383/AnsiballZ_systemd.py'
Jan 26 14:51:25 compute-1 sudo[140283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:25 compute-1 python3.9[140285]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 14:51:25 compute-1 systemd[1]: Reloading.
Jan 26 14:51:25 compute-1 systemd-sysv-generator[140322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:51:25 compute-1 systemd-rc-local-generator[140319]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:51:25 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 26 14:51:25 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 26 14:51:25 compute-1 sudo[140283]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:26 compute-1 sudo[140476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsklrapwvilysordqwxohltxylqzoggl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439085.9904466-794-61814789565349/AnsiballZ_systemd.py'
Jan 26 14:51:26 compute-1 sudo[140476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:26 compute-1 python3.9[140478]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:26 compute-1 sudo[140476]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:27 compute-1 sshd-session[139965]: Invalid user odoo18 from 185.246.128.170 port 40113
Jan 26 14:51:27 compute-1 sudo[140631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuzgpxfendvipzubgydijosmucljcnde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439086.831219-794-41765383251814/AnsiballZ_systemd.py'
Jan 26 14:51:27 compute-1 sudo[140631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:27 compute-1 python3.9[140633]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:27 compute-1 sudo[140631]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:27 compute-1 sudo[140786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neznfllmrsqnivapuvsyllbdotctdtrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439087.6315725-794-269074736163489/AnsiballZ_systemd.py'
Jan 26 14:51:27 compute-1 sudo[140786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:28 compute-1 python3.9[140788]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:28 compute-1 sudo[140786]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:28 compute-1 sudo[140941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dueipllsonnezigosmuldtrttnxbdagx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439088.4930022-794-163222239219756/AnsiballZ_systemd.py'
Jan 26 14:51:28 compute-1 sudo[140941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:51:28.993 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:51:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:51:28.994 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:51:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:51:28.994 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:51:29 compute-1 python3.9[140943]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:29 compute-1 sudo[140941]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:29 compute-1 sudo[141097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lutwhsjfrbxxlodpirtctsmcttxhdgfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439089.2645907-794-189686830906397/AnsiballZ_systemd.py'
Jan 26 14:51:29 compute-1 sudo[141097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:29 compute-1 python3.9[141099]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:29 compute-1 sudo[141097]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:30 compute-1 sudo[141252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iamjkdqxsqsjizmdzspipmmwmhejbzos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439090.0675554-794-152658480122187/AnsiballZ_systemd.py'
Jan 26 14:51:30 compute-1 sudo[141252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:30 compute-1 python3.9[141254]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:30 compute-1 sudo[141252]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:31 compute-1 sudo[141437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlflqdeqpktegfwtektppsojsexzccmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439090.957042-794-19806607653317/AnsiballZ_systemd.py'
Jan 26 14:51:31 compute-1 sudo[141437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:31 compute-1 podman[141382]: 2026-01-26 14:51:31.293582346 +0000 UTC m=+0.090045929 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 14:51:31 compute-1 podman[141381]: 2026-01-26 14:51:31.302516356 +0000 UTC m=+0.098882406 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 14:51:31 compute-1 python3.9[141445]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:31 compute-1 sudo[141437]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:31 compute-1 sshd-session[139965]: Disconnecting invalid user odoo18 185.246.128.170 port 40113: Change of username or service not allowed: (odoo18,ssh-connection) -> (Admin,ssh-connection) [preauth]
Jan 26 14:51:32 compute-1 sudo[141603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvlhbnsshggbllgxfoqzmkfdmxzlvtkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439091.7845426-794-98852420877466/AnsiballZ_systemd.py'
Jan 26 14:51:32 compute-1 sudo[141603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:32 compute-1 python3.9[141605]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:32 compute-1 sudo[141603]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:32 compute-1 sudo[141758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eycylbtncgkfkwwkpkvgxkfeqvolyvdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439092.611397-794-48703399545317/AnsiballZ_systemd.py'
Jan 26 14:51:32 compute-1 sudo[141758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:33 compute-1 python3.9[141760]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:33 compute-1 sudo[141758]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:33 compute-1 sudo[141913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnogwmwgzdliznuwkaxpzzxoaygyhegq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439093.3994274-794-138360033847211/AnsiballZ_systemd.py'
Jan 26 14:51:33 compute-1 sudo[141913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:33 compute-1 python3.9[141915]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:34 compute-1 sudo[141913]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:34 compute-1 sudo[142069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syyibyhhqdvprsjpsekupccdewkcicay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439094.167816-794-170653482912279/AnsiballZ_systemd.py'
Jan 26 14:51:34 compute-1 sudo[142069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:34 compute-1 python3.9[142071]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:34 compute-1 sudo[142069]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:35 compute-1 sudo[142225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkcwdwnzgkfmhlwfgklmicgcbwkqpyli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439094.9946847-794-56613616893596/AnsiballZ_systemd.py'
Jan 26 14:51:35 compute-1 sudo[142225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:35 compute-1 python3.9[142227]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:35 compute-1 sudo[142225]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:36 compute-1 sudo[142380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppsjpfojumtkamywjnlfclwmneehwvkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439095.7736013-794-214119674370849/AnsiballZ_systemd.py'
Jan 26 14:51:36 compute-1 sudo[142380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:36 compute-1 python3.9[142382]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:36 compute-1 sudo[142380]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:36 compute-1 sshd-session[141919]: Invalid user Admin from 185.246.128.170 port 43166
Jan 26 14:51:36 compute-1 sudo[142535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgmrxibfrubghwligonkrtsmiqcktyub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439096.5361662-794-6701897145692/AnsiballZ_systemd.py'
Jan 26 14:51:36 compute-1 sudo[142535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:37 compute-1 python3.9[142537]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 14:51:37 compute-1 sudo[142535]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:37 compute-1 sudo[142690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnwtmoelsohsidijzqgawngcnrgivfnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439097.686208-998-103199626557000/AnsiballZ_file.py'
Jan 26 14:51:37 compute-1 sudo[142690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:38 compute-1 python3.9[142692]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:51:38 compute-1 sudo[142690]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:38 compute-1 sudo[142842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njjphpvuswrmgttvznpompnunfvnrltg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439098.309328-998-21743554135274/AnsiballZ_file.py'
Jan 26 14:51:38 compute-1 sudo[142842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:38 compute-1 python3.9[142844]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:51:38 compute-1 sudo[142842]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:39 compute-1 sudo[142994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qagrcnhlhqfehzolspjnytcfngroiqlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439098.9509676-998-32878204404088/AnsiballZ_file.py'
Jan 26 14:51:39 compute-1 sudo[142994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:39 compute-1 python3.9[142996]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:51:39 compute-1 sudo[142994]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:39 compute-1 sudo[143146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcpyuqzotxyltaqkteltydfgttxihagz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439099.5498161-998-130395617688830/AnsiballZ_file.py'
Jan 26 14:51:39 compute-1 sudo[143146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:40 compute-1 python3.9[143148]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:51:40 compute-1 sudo[143146]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:40 compute-1 sudo[143298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jadkssfwyjcgleikxsbtuchmffqvbvtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439100.1503105-998-166117953015782/AnsiballZ_file.py'
Jan 26 14:51:40 compute-1 sudo[143298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:40 compute-1 python3.9[143300]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:51:40 compute-1 sudo[143298]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:41 compute-1 sudo[143450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acdhrvptmbzqtrgwxexraeibttthamdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439100.768148-998-230296966837527/AnsiballZ_file.py'
Jan 26 14:51:41 compute-1 sudo[143450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:41 compute-1 python3.9[143452]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:51:41 compute-1 sudo[143450]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:41 compute-1 sshd-session[141919]: Disconnecting invalid user Admin 185.246.128.170 port 43166: Change of username or service not allowed: (Admin,ssh-connection) -> (azure,ssh-connection) [preauth]
Jan 26 14:51:42 compute-1 python3.9[143602]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:51:42 compute-1 sudo[143753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmqsygdlwlnvbzsmzkeientukqzrjomt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439102.2696438-1100-72750727232096/AnsiballZ_stat.py'
Jan 26 14:51:42 compute-1 sudo[143753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:42 compute-1 python3.9[143755]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:51:42 compute-1 sudo[143753]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:43 compute-1 sudo[143878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klfyqxjjtvjzdlarfcblwimjstpqlplq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439102.2696438-1100-72750727232096/AnsiballZ_copy.py'
Jan 26 14:51:43 compute-1 sudo[143878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:43 compute-1 python3.9[143880]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769439102.2696438-1100-72750727232096/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:43 compute-1 sudo[143878]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:44 compute-1 sudo[144030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugxdikcttdwomqgewltjnzzqxjgzfhjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439103.7646506-1100-260445705509929/AnsiballZ_stat.py'
Jan 26 14:51:44 compute-1 sudo[144030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:44 compute-1 python3.9[144032]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:51:44 compute-1 sudo[144030]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:44 compute-1 sudo[144156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cadbfrymmllayinxqoodnhfpyvvtomue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439103.7646506-1100-260445705509929/AnsiballZ_copy.py'
Jan 26 14:51:44 compute-1 sudo[144156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:44 compute-1 python3.9[144158]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769439103.7646506-1100-260445705509929/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:44 compute-1 sudo[144156]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:45 compute-1 sudo[144308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahlepixfhstekujthuxetjcgexudntxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439104.9586947-1100-224903744867965/AnsiballZ_stat.py'
Jan 26 14:51:45 compute-1 sudo[144308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:45 compute-1 python3.9[144310]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:51:45 compute-1 sudo[144308]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:45 compute-1 sudo[144433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkhevdsbfnzmmbmshxlwjrlsezefdsuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439104.9586947-1100-224903744867965/AnsiballZ_copy.py'
Jan 26 14:51:45 compute-1 sudo[144433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:45 compute-1 python3.9[144435]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769439104.9586947-1100-224903744867965/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:46 compute-1 sudo[144433]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:46 compute-1 sudo[144585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oulnqvnsowytsbwhpgubfriearxchpmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439106.1346145-1100-180393012074175/AnsiballZ_stat.py'
Jan 26 14:51:46 compute-1 sudo[144585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:46 compute-1 python3.9[144587]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:51:46 compute-1 sudo[144585]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:46 compute-1 sudo[144710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbhznvtluwbirkxsvdybulyqdkfnwlaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439106.1346145-1100-180393012074175/AnsiballZ_copy.py'
Jan 26 14:51:46 compute-1 sudo[144710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:47 compute-1 python3.9[144712]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769439106.1346145-1100-180393012074175/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:47 compute-1 sudo[144710]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:47 compute-1 sudo[144862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeihhshzdrgydxwcflcwoysqgzcmjslr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439107.2982745-1100-84702002642336/AnsiballZ_stat.py'
Jan 26 14:51:47 compute-1 sudo[144862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:47 compute-1 python3.9[144864]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:51:47 compute-1 sudo[144862]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:48 compute-1 sudo[144987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntngzajxioqamutswmrnvopllyrsfpkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439107.2982745-1100-84702002642336/AnsiballZ_copy.py'
Jan 26 14:51:48 compute-1 sudo[144987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:48 compute-1 python3.9[144989]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769439107.2982745-1100-84702002642336/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:48 compute-1 sudo[144987]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:48 compute-1 sudo[145139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvbzylxkbecttsqxfjicydlfhjuupwlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439108.4722438-1100-48578713064918/AnsiballZ_stat.py'
Jan 26 14:51:48 compute-1 sudo[145139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:48 compute-1 python3.9[145141]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:51:48 compute-1 sudo[145139]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:49 compute-1 sudo[145264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hanhojchaflbmjsfrbkskdwfljgehoss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439108.4722438-1100-48578713064918/AnsiballZ_copy.py'
Jan 26 14:51:49 compute-1 sudo[145264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:49 compute-1 python3.9[145266]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769439108.4722438-1100-48578713064918/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:49 compute-1 sudo[145264]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:49 compute-1 sudo[145416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htoagweynxcmgimdlscqkglmyynqxgab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439109.6974454-1100-50541315237961/AnsiballZ_stat.py'
Jan 26 14:51:49 compute-1 sudo[145416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:50 compute-1 python3.9[145418]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:51:50 compute-1 sudo[145416]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:50 compute-1 sudo[145539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrhazljzyjzpjhpgsqnccyjuvuizowle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439109.6974454-1100-50541315237961/AnsiballZ_copy.py'
Jan 26 14:51:50 compute-1 sudo[145539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:50 compute-1 python3.9[145541]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769439109.6974454-1100-50541315237961/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:50 compute-1 sudo[145539]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:51 compute-1 sudo[145691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhcptdliselbaogtklocmfiqknrtqcxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439110.8027666-1100-232448193228263/AnsiballZ_stat.py'
Jan 26 14:51:51 compute-1 sudo[145691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:51 compute-1 python3.9[145693]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:51:51 compute-1 sudo[145691]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:51 compute-1 sudo[145816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uopolojvnyrmnmnxtkhhyfrccmxqsrnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439110.8027666-1100-232448193228263/AnsiballZ_copy.py'
Jan 26 14:51:51 compute-1 sudo[145816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:51 compute-1 python3.9[145818]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769439110.8027666-1100-232448193228263/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:51 compute-1 sudo[145816]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:52 compute-1 sudo[145968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wprrjcpzjrcvvqbpidscxtrvfafmypie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439112.18821-1326-44202404075395/AnsiballZ_command.py'
Jan 26 14:51:52 compute-1 sudo[145968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:52 compute-1 python3.9[145970]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 26 14:51:52 compute-1 sudo[145968]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:53 compute-1 sudo[146121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yynwlgljkekcufytbrpzbsbwcyihtacm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439112.9654849-1344-178027363948160/AnsiballZ_file.py'
Jan 26 14:51:53 compute-1 sudo[146121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:53 compute-1 python3.9[146123]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:53 compute-1 sudo[146121]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:53 compute-1 sudo[146273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euhgjdqchuuoivyeprceqoftzpdbtmwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439113.6444092-1344-62840561661044/AnsiballZ_file.py'
Jan 26 14:51:53 compute-1 sudo[146273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:54 compute-1 python3.9[146275]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:54 compute-1 sudo[146273]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:54 compute-1 sudo[146425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghbqbntawvtyhsmdxjvlgeddnbykekar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439114.298303-1344-163728403553444/AnsiballZ_file.py'
Jan 26 14:51:54 compute-1 sudo[146425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:54 compute-1 python3.9[146427]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:54 compute-1 sudo[146425]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:55 compute-1 sudo[146577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovnmbpnsdmjputallywitsjwtuwkmygz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439114.9444022-1344-68083064331389/AnsiballZ_file.py'
Jan 26 14:51:55 compute-1 sudo[146577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:55 compute-1 python3.9[146579]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:55 compute-1 sudo[146577]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:55 compute-1 sudo[146729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kngviarztbyptiadmoncsznnvvqyvynh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439115.6194315-1344-134966089273838/AnsiballZ_file.py'
Jan 26 14:51:55 compute-1 sudo[146729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:55 compute-1 sshd-session[143635]: Invalid user azure from 185.246.128.170 port 37095
Jan 26 14:51:56 compute-1 python3.9[146731]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:56 compute-1 sudo[146729]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:56 compute-1 sudo[146881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwsggswgirmacthrlveaezmebdhrfskh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439116.2280412-1344-114587895831104/AnsiballZ_file.py'
Jan 26 14:51:56 compute-1 sudo[146881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:56 compute-1 python3.9[146883]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:56 compute-1 sudo[146881]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:56 compute-1 sshd-session[143635]: Disconnecting invalid user azure 185.246.128.170 port 37095: Change of username or service not allowed: (azure,ssh-connection) -> (zhongwen,ssh-connection) [preauth]
Jan 26 14:51:57 compute-1 sudo[147035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xghtmmjahklrqtzmwuibsevmpyybvvwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439116.9490998-1344-264075064419631/AnsiballZ_file.py'
Jan 26 14:51:57 compute-1 sudo[147035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:57 compute-1 python3.9[147037]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:57 compute-1 sudo[147035]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:57 compute-1 sshd-session[146968]: Invalid user zhongwen from 185.246.128.170 port 60696
Jan 26 14:51:57 compute-1 sudo[147187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbrtjuyybymjdlzwyuixhjcfkdsatsxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439117.62166-1344-44556854436465/AnsiballZ_file.py'
Jan 26 14:51:57 compute-1 sudo[147187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:58 compute-1 python3.9[147189]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:58 compute-1 sudo[147187]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:58 compute-1 sshd-session[146968]: Disconnecting invalid user zhongwen 185.246.128.170 port 60696: Change of username or service not allowed: (zhongwen,ssh-connection) -> (ADMIN,ssh-connection) [preauth]
Jan 26 14:51:58 compute-1 sudo[147339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzxcrvulfuubarfdibfvvvtkqhxlyias ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439118.303631-1344-19336716806904/AnsiballZ_file.py'
Jan 26 14:51:58 compute-1 sudo[147339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:58 compute-1 python3.9[147341]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:58 compute-1 sudo[147339]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:59 compute-1 sudo[147491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltcfmtyrnhjlsjdtdxyfgscelqfaxbtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439118.9882295-1344-263777698616265/AnsiballZ_file.py'
Jan 26 14:51:59 compute-1 sudo[147491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:51:59 compute-1 python3.9[147493]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:51:59 compute-1 sudo[147491]: pam_unix(sudo:session): session closed for user root
Jan 26 14:51:59 compute-1 sudo[147643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvwgruwfluvitswgdbxtwrlzqnezwjeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439119.5690975-1344-221827213586653/AnsiballZ_file.py'
Jan 26 14:51:59 compute-1 sudo[147643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:00 compute-1 python3.9[147645]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:00 compute-1 sudo[147643]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:00 compute-1 sudo[147795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etjbpvjdnchufqqjkazzgkphzbbhevjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439120.218902-1344-111971594730573/AnsiballZ_file.py'
Jan 26 14:52:00 compute-1 sudo[147795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:00 compute-1 python3.9[147797]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:00 compute-1 sudo[147795]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:01 compute-1 sudo[147947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnkmjdlmrgrpxnkdixgbejoqizzliawu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439120.87728-1344-101171620094635/AnsiballZ_file.py'
Jan 26 14:52:01 compute-1 sudo[147947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:01 compute-1 python3.9[147949]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:01 compute-1 sudo[147947]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:01 compute-1 sudo[148118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwfsluozxdtmcoxahuprwuhegcgurwbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439121.5567293-1344-46617557772226/AnsiballZ_file.py'
Jan 26 14:52:01 compute-1 sudo[148118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:01 compute-1 podman[148074]: 2026-01-26 14:52:01.863687989 +0000 UTC m=+0.073104157 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 14:52:01 compute-1 podman[148073]: 2026-01-26 14:52:01.936600571 +0000 UTC m=+0.141528984 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, 
org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Jan 26 14:52:02 compute-1 python3.9[148126]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:02 compute-1 sudo[148118]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:02 compute-1 sudo[148290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoywzyhwyhxfpfngnooqjhtusxuqdqfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439122.2521849-1542-31356808660017/AnsiballZ_stat.py'
Jan 26 14:52:02 compute-1 sudo[148290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:02 compute-1 python3.9[148292]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:02 compute-1 sudo[148290]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:03 compute-1 sudo[148413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpozgxyelmoytcadcduxhphhkxirbsvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439122.2521849-1542-31356808660017/AnsiballZ_copy.py'
Jan 26 14:52:03 compute-1 sudo[148413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:03 compute-1 python3.9[148415]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439122.2521849-1542-31356808660017/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:03 compute-1 sudo[148413]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:03 compute-1 sudo[148566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztiqayeegahjydckkbkbouhgwwtwlzoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439123.443681-1542-232496874791049/AnsiballZ_stat.py'
Jan 26 14:52:03 compute-1 sudo[148566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:03 compute-1 python3.9[148568]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:03 compute-1 sudo[148566]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:04 compute-1 sudo[148689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zehxxsknycehudadioayiplbpjuirxzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439123.443681-1542-232496874791049/AnsiballZ_copy.py'
Jan 26 14:52:04 compute-1 sudo[148689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:04 compute-1 python3.9[148691]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439123.443681-1542-232496874791049/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:04 compute-1 sudo[148689]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:04 compute-1 sudo[148841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fppzjiltppjxakeoazynrbgplkihjmca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439124.6847231-1542-279378768935067/AnsiballZ_stat.py'
Jan 26 14:52:04 compute-1 sudo[148841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:05 compute-1 python3.9[148843]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:05 compute-1 sudo[148841]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:05 compute-1 sudo[148964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndopzkczlzomcxckijlujvmcofnfbieo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439124.6847231-1542-279378768935067/AnsiballZ_copy.py'
Jan 26 14:52:05 compute-1 sudo[148964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:05 compute-1 python3.9[148966]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439124.6847231-1542-279378768935067/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:05 compute-1 sudo[148964]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:06 compute-1 sudo[149117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stmwyhgyizpopmydpowvcjftecykftys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439125.951447-1542-251903141429609/AnsiballZ_stat.py'
Jan 26 14:52:06 compute-1 sudo[149117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:06 compute-1 python3.9[149119]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:06 compute-1 sudo[149117]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:06 compute-1 sudo[149240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjxnyklidioxloayfiyrqlcslaobhoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439125.951447-1542-251903141429609/AnsiballZ_copy.py'
Jan 26 14:52:06 compute-1 sudo[149240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:07 compute-1 python3.9[149242]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439125.951447-1542-251903141429609/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:07 compute-1 sudo[149240]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:07 compute-1 sudo[149392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykpvayxprwauxnmxafvszrroqxmuxryx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439127.1809173-1542-982792309136/AnsiballZ_stat.py'
Jan 26 14:52:07 compute-1 sudo[149392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:07 compute-1 python3.9[149394]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:07 compute-1 sudo[149392]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:08 compute-1 sudo[149515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqxaaayijzycwpscwptmnksxtqogwfab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439127.1809173-1542-982792309136/AnsiballZ_copy.py'
Jan 26 14:52:08 compute-1 sudo[149515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:08 compute-1 python3.9[149517]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439127.1809173-1542-982792309136/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:08 compute-1 sudo[149515]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:08 compute-1 sudo[149667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzkshpbxdcnusdvmnayepwmcixzpchjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439128.4692223-1542-66794156941020/AnsiballZ_stat.py'
Jan 26 14:52:08 compute-1 sudo[149667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:08 compute-1 python3.9[149669]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:08 compute-1 sudo[149667]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:09 compute-1 sudo[149790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgsqsknkbhxdsplzhfnlkhvihsenhgnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439128.4692223-1542-66794156941020/AnsiballZ_copy.py'
Jan 26 14:52:09 compute-1 sudo[149790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:09 compute-1 python3.9[149792]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439128.4692223-1542-66794156941020/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:09 compute-1 sudo[149790]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:09 compute-1 sudo[149942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qscaljwcyzomgvjhaistzrazfehlqhsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439129.6588657-1542-102260130710228/AnsiballZ_stat.py'
Jan 26 14:52:09 compute-1 sudo[149942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:10 compute-1 python3.9[149944]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:10 compute-1 sudo[149942]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:10 compute-1 sudo[150065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlebntvqcfqdqyypeazqxxundhuvlooz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439129.6588657-1542-102260130710228/AnsiballZ_copy.py'
Jan 26 14:52:10 compute-1 sudo[150065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:10 compute-1 python3.9[150067]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439129.6588657-1542-102260130710228/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:10 compute-1 sudo[150065]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:11 compute-1 sudo[150217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrpipsupmaehizicziswqnronesvfgbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439130.7599325-1542-12957152940853/AnsiballZ_stat.py'
Jan 26 14:52:11 compute-1 sudo[150217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:11 compute-1 python3.9[150219]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:11 compute-1 sudo[150217]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:11 compute-1 sudo[150340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjlfhclffgfbmhbyikngbrpglqcqrxjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439130.7599325-1542-12957152940853/AnsiballZ_copy.py'
Jan 26 14:52:11 compute-1 sudo[150340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:11 compute-1 python3.9[150342]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439130.7599325-1542-12957152940853/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:11 compute-1 sudo[150340]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:12 compute-1 sudo[150492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbgtycmgyygmwfpicxaoslpujljkgffi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439132.0042121-1542-82351573644149/AnsiballZ_stat.py'
Jan 26 14:52:12 compute-1 sudo[150492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:12 compute-1 python3.9[150494]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:12 compute-1 sudo[150492]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:12 compute-1 sudo[150615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmxnmmuwgaajnyzlbjtdkhbzzbufzirg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439132.0042121-1542-82351573644149/AnsiballZ_copy.py'
Jan 26 14:52:12 compute-1 sudo[150615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:13 compute-1 python3.9[150617]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439132.0042121-1542-82351573644149/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:13 compute-1 sudo[150615]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:13 compute-1 sudo[150767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esitydwdyfkatseqrvceisuymwhvryed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439133.3082712-1542-95809296203718/AnsiballZ_stat.py'
Jan 26 14:52:13 compute-1 sudo[150767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:13 compute-1 python3.9[150769]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:13 compute-1 sudo[150767]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:14 compute-1 sudo[150890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eszkjggxcekvmunqrepbsagyaysmyjls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439133.3082712-1542-95809296203718/AnsiballZ_copy.py'
Jan 26 14:52:14 compute-1 sudo[150890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:14 compute-1 python3.9[150892]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439133.3082712-1542-95809296203718/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:14 compute-1 sudo[150890]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:14 compute-1 sudo[151042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwwvlhlxqdbnvcdwzgdvzinhclwcaxwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439134.5437353-1542-224777056637258/AnsiballZ_stat.py'
Jan 26 14:52:14 compute-1 sudo[151042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:15 compute-1 python3.9[151044]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:15 compute-1 sudo[151042]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:15 compute-1 sudo[151165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oybwirxwrzeimjgsodtwzpyrhokpgrqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439134.5437353-1542-224777056637258/AnsiballZ_copy.py'
Jan 26 14:52:15 compute-1 sudo[151165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:15 compute-1 sshd-session[148515]: Invalid user ADMIN from 185.246.128.170 port 56204
Jan 26 14:52:15 compute-1 python3.9[151167]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439134.5437353-1542-224777056637258/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:15 compute-1 sudo[151165]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:16 compute-1 sudo[151317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajdesrjoloqrjuzcfidvlxjvcwxhlrgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439135.8004405-1542-187133264562559/AnsiballZ_stat.py'
Jan 26 14:52:16 compute-1 sudo[151317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:16 compute-1 python3.9[151319]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:16 compute-1 sudo[151317]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:16 compute-1 sshd-session[148515]: Disconnecting invalid user ADMIN 185.246.128.170 port 56204: Change of username or service not allowed: (ADMIN,ssh-connection) -> (clouduser,ssh-connection) [preauth]
Jan 26 14:52:16 compute-1 sudo[151440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvsmckxzzdkrqjlalkhkvysjazembbhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439135.8004405-1542-187133264562559/AnsiballZ_copy.py'
Jan 26 14:52:16 compute-1 sudo[151440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:16 compute-1 python3.9[151442]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439135.8004405-1542-187133264562559/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:16 compute-1 sudo[151440]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:17 compute-1 sudo[151592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwxbzxcejeeoldxcnekstcmkrcolvwlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439137.0107236-1542-205194413918571/AnsiballZ_stat.py'
Jan 26 14:52:17 compute-1 sudo[151592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:17 compute-1 python3.9[151594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:17 compute-1 sudo[151592]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:18 compute-1 sudo[151715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwyhiladwiaqjlbagjyuotewivmkzykd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439137.0107236-1542-205194413918571/AnsiballZ_copy.py'
Jan 26 14:52:18 compute-1 sudo[151715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:18 compute-1 python3.9[151717]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439137.0107236-1542-205194413918571/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:18 compute-1 sudo[151715]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:18 compute-1 sudo[151867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apgcnxwtororksqerfljrojeuxnwvfzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439138.4261367-1542-50383487757894/AnsiballZ_stat.py'
Jan 26 14:52:18 compute-1 sudo[151867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:18 compute-1 python3.9[151869]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:18 compute-1 sudo[151867]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:19 compute-1 sudo[151990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zybqtwkxukrtvnbybuzwmunajuhnreyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439138.4261367-1542-50383487757894/AnsiballZ_copy.py'
Jan 26 14:52:19 compute-1 sudo[151990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:19 compute-1 python3.9[151992]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439138.4261367-1542-50383487757894/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:19 compute-1 sudo[151990]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:20 compute-1 python3.9[152142]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:52:20 compute-1 sudo[152295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srhbhfrqnlnrenvaqzzvnjrlbrzkckvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439140.4916003-1954-259303000533741/AnsiballZ_seboolean.py'
Jan 26 14:52:20 compute-1 sudo[152295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:21 compute-1 python3.9[152297]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 26 14:52:22 compute-1 sudo[152295]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:22 compute-1 sudo[152452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocadzaivxhpyvhobzmjrtozpmmspbfdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439142.5525548-1970-195670112763109/AnsiballZ_copy.py'
Jan 26 14:52:22 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 26 14:52:22 compute-1 sudo[152452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:23 compute-1 python3.9[152454]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:23 compute-1 sudo[152452]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:23 compute-1 sudo[152605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aatliypkfkbdxtgieyammszpxoalxumz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439143.2475188-1970-157572769345852/AnsiballZ_copy.py'
Jan 26 14:52:23 compute-1 sudo[152605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:23 compute-1 python3.9[152607]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:23 compute-1 sudo[152605]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:24 compute-1 sudo[152757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjvuytoeavwejpxuvanjnpdszbnynnsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439143.892034-1970-224640884000419/AnsiballZ_copy.py'
Jan 26 14:52:24 compute-1 sudo[152757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:24 compute-1 python3.9[152759]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:24 compute-1 sudo[152757]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:24 compute-1 sudo[152909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpjdbzxhiyeiekbpweiiafoggruwsfqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439144.5159776-1970-65677442875368/AnsiballZ_copy.py'
Jan 26 14:52:24 compute-1 sudo[152909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:24 compute-1 python3.9[152911]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:24 compute-1 sudo[152909]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:25 compute-1 sudo[153061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufkkznzxobbtuclbablceqpwpyiljcvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439145.115219-1970-227479500953186/AnsiballZ_copy.py'
Jan 26 14:52:25 compute-1 sudo[153061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:25 compute-1 python3.9[153063]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:25 compute-1 sudo[153061]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:25 compute-1 sshd-session[152298]: Invalid user clouduser from 185.246.128.170 port 43889
Jan 26 14:52:26 compute-1 sudo[153213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozldaylqxcjqgkmjwvjriavyqrvhvtcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439145.8407984-2042-171249498152512/AnsiballZ_copy.py'
Jan 26 14:52:26 compute-1 sudo[153213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:26 compute-1 python3.9[153215]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:26 compute-1 sudo[153213]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:26 compute-1 sudo[153365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yucaffnoenmsqulrxnzvtjiwxqftqjwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439146.4669838-2042-173154140441589/AnsiballZ_copy.py'
Jan 26 14:52:26 compute-1 sudo[153365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:26 compute-1 python3.9[153367]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:26 compute-1 sshd-session[152298]: Disconnecting invalid user clouduser 185.246.128.170 port 43889: Change of username or service not allowed: (clouduser,ssh-connection) -> (smb,ssh-connection) [preauth]
Jan 26 14:52:26 compute-1 sudo[153365]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:27 compute-1 sudo[153517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psbrqzwioyojwlkxcuzpxkptwlbehpoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439147.065702-2042-39100085879104/AnsiballZ_copy.py'
Jan 26 14:52:27 compute-1 sudo[153517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:27 compute-1 python3.9[153519]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:27 compute-1 sudo[153517]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:28 compute-1 sudo[153669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skawocobpljlacmlooyaevcbjicpmakg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439147.7387607-2042-238015121616480/AnsiballZ_copy.py'
Jan 26 14:52:28 compute-1 sudo[153669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:28 compute-1 python3.9[153671]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:28 compute-1 sudo[153669]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:28 compute-1 sudo[153821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxlpacbejrsvxvvjurfacrlclsysmfhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439148.409985-2042-86713730700432/AnsiballZ_copy.py'
Jan 26 14:52:28 compute-1 sudo[153821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:28 compute-1 python3.9[153823]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:28 compute-1 sudo[153821]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:52:28.995 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:52:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:52:28.996 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:52:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:52:28.996 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:52:29 compute-1 sudo[153974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbeoacsqxcqhfygoywmzerbfmnwqgbqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439149.1901503-2114-215615387923800/AnsiballZ_systemd.py'
Jan 26 14:52:29 compute-1 sudo[153974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:29 compute-1 python3.9[153976]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:52:29 compute-1 systemd[1]: Reloading.
Jan 26 14:52:30 compute-1 systemd-rc-local-generator[154004]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:52:30 compute-1 systemd-sysv-generator[154008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:52:30 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Jan 26 14:52:30 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Jan 26 14:52:30 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 26 14:52:30 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 26 14:52:30 compute-1 systemd[1]: Starting libvirt logging daemon...
Jan 26 14:52:30 compute-1 systemd[1]: Started libvirt logging daemon.
Jan 26 14:52:30 compute-1 sudo[153974]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:30 compute-1 sudo[154168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paxscspwejeixcqppsualyxthisrsjyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439150.4984689-2114-44141148905548/AnsiballZ_systemd.py'
Jan 26 14:52:30 compute-1 sudo[154168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:31 compute-1 python3.9[154170]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:52:31 compute-1 systemd[1]: Reloading.
Jan 26 14:52:31 compute-1 systemd-rc-local-generator[154198]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:52:31 compute-1 systemd-sysv-generator[154201]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:52:31 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 26 14:52:31 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 26 14:52:31 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 26 14:52:31 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 26 14:52:31 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 26 14:52:31 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 26 14:52:31 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 14:52:31 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 26 14:52:31 compute-1 sudo[154168]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:31 compute-1 sudo[154384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbzddzkvoefyillgrjcqetrgkctexkdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439151.603051-2114-212832049683908/AnsiballZ_systemd.py'
Jan 26 14:52:31 compute-1 sudo[154384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:32 compute-1 podman[154386]: 2026-01-26 14:52:31.999733092 +0000 UTC m=+0.057515955 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 26 14:52:32 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 26 14:52:32 compute-1 podman[154406]: 2026-01-26 14:52:32.123274286 +0000 UTC m=+0.088978392 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 14:52:32 compute-1 python3.9[154387]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:52:32 compute-1 systemd[1]: Reloading.
Jan 26 14:52:32 compute-1 systemd-rc-local-generator[154461]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:52:32 compute-1 systemd-sysv-generator[154465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:52:32 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 26 14:52:32 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 26 14:52:32 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 26 14:52:32 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 26 14:52:32 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 26 14:52:32 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 14:52:32 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 14:52:32 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 26 14:52:32 compute-1 sudo[154384]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:32 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 26 14:52:33 compute-1 sudo[154652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwfockuakrobeenglmwyirlnnjlaadfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439152.685095-2114-239979617946258/AnsiballZ_systemd.py'
Jan 26 14:52:33 compute-1 sudo[154652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:33 compute-1 python3.9[154654]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:52:33 compute-1 systemd[1]: Reloading.
Jan 26 14:52:33 compute-1 systemd-sysv-generator[154681]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:52:33 compute-1 systemd-rc-local-generator[154677]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:52:33 compute-1 setroubleshoot[154412]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a5d36c8e-07b0-42b6-915e-fac7bcd52d6b
Jan 26 14:52:33 compute-1 setroubleshoot[154412]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 26 14:52:33 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Jan 26 14:52:33 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 26 14:52:33 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 26 14:52:33 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 26 14:52:33 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 26 14:52:33 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 26 14:52:33 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 26 14:52:33 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 26 14:52:33 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 26 14:52:33 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 26 14:52:33 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 14:52:33 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 26 14:52:33 compute-1 sudo[154652]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:34 compute-1 sudo[154870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vapkddikfmjqjywbsgvverfaulibaumm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439153.9207835-2114-3547020548472/AnsiballZ_systemd.py'
Jan 26 14:52:34 compute-1 sudo[154870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:34 compute-1 python3.9[154872]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:52:34 compute-1 systemd[1]: Reloading.
Jan 26 14:52:34 compute-1 systemd-sysv-generator[154903]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:52:34 compute-1 systemd-rc-local-generator[154900]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:52:34 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Jan 26 14:52:34 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Jan 26 14:52:34 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 26 14:52:34 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 26 14:52:34 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 26 14:52:34 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 26 14:52:34 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 26 14:52:34 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 26 14:52:34 compute-1 sudo[154870]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:35 compute-1 sudo[155083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgrufzmfwooavbxaxybsvmccjwflvikd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439155.3019245-2188-4090255104764/AnsiballZ_file.py'
Jan 26 14:52:35 compute-1 sudo[155083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:35 compute-1 python3.9[155085]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:35 compute-1 sudo[155083]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:36 compute-1 sudo[155235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrstkgxsbjdmfkvqdnwylywkflrwzuor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439156.034649-2204-1880798495988/AnsiballZ_find.py'
Jan 26 14:52:36 compute-1 sudo[155235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:36 compute-1 python3.9[155237]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 14:52:36 compute-1 sudo[155235]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:37 compute-1 sudo[155387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwnfcvzggphbjemgwunjffraoavffrvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439156.9985442-2232-153953624581530/AnsiballZ_stat.py'
Jan 26 14:52:37 compute-1 sudo[155387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:37 compute-1 python3.9[155389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:37 compute-1 sudo[155387]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:37 compute-1 sudo[155510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsqyculmeoxtcwhbdxdkxztjersqeube ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439156.9985442-2232-153953624581530/AnsiballZ_copy.py'
Jan 26 14:52:37 compute-1 sudo[155510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:38 compute-1 python3.9[155512]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439156.9985442-2232-153953624581530/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:38 compute-1 sudo[155510]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:38 compute-1 sudo[155662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgxiidkmvqiuzmceofwfizbnrekrbolu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439158.4675188-2264-24738443040767/AnsiballZ_file.py'
Jan 26 14:52:38 compute-1 sudo[155662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:39 compute-1 python3.9[155664]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:39 compute-1 sudo[155662]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:39 compute-1 sshd-session[154655]: Invalid user smb from 185.246.128.170 port 40621
Jan 26 14:52:39 compute-1 sudo[155814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onhoydlevzhhlxcusdmkxxppnqpkrryt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439159.281164-2280-26211157646621/AnsiballZ_stat.py'
Jan 26 14:52:39 compute-1 sudo[155814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:39 compute-1 python3.9[155816]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:39 compute-1 sudo[155814]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:40 compute-1 sudo[155892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfkaxhafwfcyiadeueopfxwawiuhidhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439159.281164-2280-26211157646621/AnsiballZ_file.py'
Jan 26 14:52:40 compute-1 sudo[155892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:40 compute-1 sshd-session[154655]: Disconnecting invalid user smb 185.246.128.170 port 40621: Change of username or service not allowed: (smb,ssh-connection) -> (mongod,ssh-connection) [preauth]
Jan 26 14:52:40 compute-1 python3.9[155894]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:40 compute-1 sudo[155892]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:40 compute-1 sudo[156044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enwoijmlabltjhgxekqywaaqipmyrksn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439160.5968063-2305-206875478595699/AnsiballZ_stat.py'
Jan 26 14:52:40 compute-1 sudo[156044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:41 compute-1 python3.9[156046]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:41 compute-1 sudo[156044]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:41 compute-1 sudo[156122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyfdyrsemgbvzbjaadswgvxjjinrrptl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439160.5968063-2305-206875478595699/AnsiballZ_file.py'
Jan 26 14:52:41 compute-1 sudo[156122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:41 compute-1 python3.9[156124]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.7gl8xlfr recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:41 compute-1 sudo[156122]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:42 compute-1 sudo[156274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obktmwxeqdxloskjnjmrwgzjggdmngxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439161.8478723-2328-5167422615415/AnsiballZ_stat.py'
Jan 26 14:52:42 compute-1 sudo[156274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:42 compute-1 python3.9[156276]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:42 compute-1 sudo[156274]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:42 compute-1 sudo[156352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yukjblwowdaogqthacxdsutkydcacgyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439161.8478723-2328-5167422615415/AnsiballZ_file.py'
Jan 26 14:52:42 compute-1 sudo[156352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:42 compute-1 python3.9[156354]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:42 compute-1 sudo[156352]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:43 compute-1 sudo[156504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-impmzcyfyxgziyrauerfptxzdkyggkcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439163.0617423-2354-112377816950041/AnsiballZ_command.py'
Jan 26 14:52:43 compute-1 sudo[156504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:43 compute-1 python3.9[156506]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:52:43 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 26 14:52:43 compute-1 sudo[156504]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:43 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 26 14:52:44 compute-1 sudo[156657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtblnplvporagxqgvvytyyrjhnqekkuf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769439163.7836132-2370-198652824669661/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 14:52:44 compute-1 sudo[156657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:44 compute-1 python3[156659]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 14:52:44 compute-1 sudo[156657]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:45 compute-1 sudo[156810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vetcnflfvvjucwtwqbbymoxqshnfypes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439164.6988392-2387-147437316486356/AnsiballZ_stat.py'
Jan 26 14:52:45 compute-1 sudo[156810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:45 compute-1 python3.9[156812]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:45 compute-1 sudo[156810]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:45 compute-1 sudo[156888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhnjtxzobvhdtsjibldegnhovdrwvvqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439164.6988392-2387-147437316486356/AnsiballZ_file.py'
Jan 26 14:52:45 compute-1 sudo[156888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:45 compute-1 python3.9[156890]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:45 compute-1 sudo[156888]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:46 compute-1 sudo[157040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmxwjwokpczrvsybloeztcbhcvxvsbvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439165.9724712-2410-30501586588698/AnsiballZ_stat.py'
Jan 26 14:52:46 compute-1 sudo[157040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:46 compute-1 python3.9[157042]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:46 compute-1 sudo[157040]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:47 compute-1 sudo[157165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwlauppiyyjxcmwkqrdhsedwyqcwpijd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439165.9724712-2410-30501586588698/AnsiballZ_copy.py'
Jan 26 14:52:47 compute-1 sudo[157165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:47 compute-1 python3.9[157167]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439165.9724712-2410-30501586588698/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:47 compute-1 sudo[157165]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:47 compute-1 sudo[157317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fayutlvcfzisczuuhhsayddckafalgfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439167.4730015-2440-158109449142074/AnsiballZ_stat.py'
Jan 26 14:52:47 compute-1 sudo[157317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:47 compute-1 python3.9[157319]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:48 compute-1 sudo[157317]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:48 compute-1 sudo[157395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-traceyfuemdbvdtqznxtptcejcilapqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439167.4730015-2440-158109449142074/AnsiballZ_file.py'
Jan 26 14:52:48 compute-1 sudo[157395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:48 compute-1 python3.9[157397]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:48 compute-1 sudo[157395]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:49 compute-1 sudo[157548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvovkpezkydmgivdogoyqkfwgvnevekc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439168.6683354-2464-187954694977484/AnsiballZ_stat.py'
Jan 26 14:52:49 compute-1 sudo[157548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:49 compute-1 python3.9[157550]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:49 compute-1 sudo[157548]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:49 compute-1 sudo[157626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jponmexectebgkmdrfxbojekhtfreuxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439168.6683354-2464-187954694977484/AnsiballZ_file.py'
Jan 26 14:52:49 compute-1 sudo[157626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:49 compute-1 python3.9[157628]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:49 compute-1 sudo[157626]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:50 compute-1 sudo[157778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcxqzosbinliqjoojdoknulgtvosgfsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439170.0071127-2488-205977944019277/AnsiballZ_stat.py'
Jan 26 14:52:50 compute-1 sudo[157778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:50 compute-1 python3.9[157780]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:50 compute-1 sudo[157778]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:50 compute-1 sudo[157903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akfvccgwvpjyhrvlvoruohprywihzotm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439170.0071127-2488-205977944019277/AnsiballZ_copy.py'
Jan 26 14:52:50 compute-1 sudo[157903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:51 compute-1 python3.9[157905]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439170.0071127-2488-205977944019277/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:51 compute-1 sudo[157903]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:51 compute-1 sshd-session[156660]: Invalid user mongod from 185.246.128.170 port 62974
Jan 26 14:52:51 compute-1 sshd-session[156660]: Disconnecting invalid user mongod 185.246.128.170 port 62974: Change of username or service not allowed: (mongod,ssh-connection) -> (wuhan,ssh-connection) [preauth]
Jan 26 14:52:51 compute-1 sudo[158056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oebqbnzihozkfkfjqajlpmugjbswyraq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439171.4014232-2518-195820726251808/AnsiballZ_file.py'
Jan 26 14:52:51 compute-1 sudo[158056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:51 compute-1 python3.9[158058]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:51 compute-1 sudo[158056]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:52 compute-1 sudo[158208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bljmzzbzkdfkvtrixgzlehjapmsnmfsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439172.1717722-2535-47513114089627/AnsiballZ_command.py'
Jan 26 14:52:52 compute-1 sudo[158208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:52 compute-1 python3.9[158210]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:52:52 compute-1 sudo[158208]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:53 compute-1 sudo[158364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcipipkkqgyaglhfvwdjfgzzyabumsoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439172.9245045-2550-62858208272358/AnsiballZ_blockinfile.py'
Jan 26 14:52:53 compute-1 sudo[158364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:53 compute-1 python3.9[158366]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:53 compute-1 sudo[158364]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:54 compute-1 sudo[158516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvdxrukazpnszexmjqdmyshemtdsgxib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439173.987435-2569-154666907893907/AnsiballZ_command.py'
Jan 26 14:52:54 compute-1 sudo[158516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:54 compute-1 python3.9[158518]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:52:54 compute-1 sudo[158516]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:55 compute-1 sudo[158669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgwavmfffaepdhlxfnnvypzadtaogulv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439174.7719479-2584-147641379621133/AnsiballZ_stat.py'
Jan 26 14:52:55 compute-1 sudo[158669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:55 compute-1 python3.9[158671]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:52:55 compute-1 sudo[158669]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:55 compute-1 sudo[158823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuapiyupiobmcoknkcwqvekewpwfflmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439175.6421802-2600-13956877650730/AnsiballZ_command.py'
Jan 26 14:52:55 compute-1 sudo[158823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:56 compute-1 python3.9[158825]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:52:56 compute-1 sudo[158823]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:56 compute-1 sudo[158978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cifvprddxrhtxtrknayrofigtmvoqayb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439176.413979-2616-272353157137970/AnsiballZ_file.py'
Jan 26 14:52:56 compute-1 sudo[158978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:56 compute-1 python3.9[158980]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:56 compute-1 sudo[158978]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:57 compute-1 sudo[159130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyysqqorunkxxyyvlszewnmionzybkkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439177.197586-2632-236809111548309/AnsiballZ_stat.py'
Jan 26 14:52:57 compute-1 sudo[159130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:57 compute-1 python3.9[159132]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:57 compute-1 sudo[159130]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:58 compute-1 sshd-session[158036]: Invalid user wuhan from 185.246.128.170 port 19954
Jan 26 14:52:58 compute-1 sudo[159253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejhssigjilnlrvjlmgiemxkmkfjtjpix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439177.197586-2632-236809111548309/AnsiballZ_copy.py'
Jan 26 14:52:58 compute-1 sudo[159253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:58 compute-1 python3.9[159255]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439177.197586-2632-236809111548309/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:58 compute-1 sudo[159253]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:58 compute-1 sudo[159405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftkgvlsowjiemlyuonpopwwqnxyawywu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439178.5541506-2663-94887127896993/AnsiballZ_stat.py'
Jan 26 14:52:58 compute-1 sudo[159405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:59 compute-1 python3.9[159407]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:52:59 compute-1 sudo[159405]: pam_unix(sudo:session): session closed for user root
Jan 26 14:52:59 compute-1 sudo[159528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrhfduesunsvoerdewttbuqhappalanu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439178.5541506-2663-94887127896993/AnsiballZ_copy.py'
Jan 26 14:52:59 compute-1 sudo[159528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:52:59 compute-1 sshd-session[158036]: Disconnecting invalid user wuhan 185.246.128.170 port 19954: Change of username or service not allowed: (wuhan,ssh-connection) -> (nagios,ssh-connection) [preauth]
Jan 26 14:52:59 compute-1 python3.9[159530]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439178.5541506-2663-94887127896993/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:52:59 compute-1 sudo[159528]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:00 compute-1 sudo[159680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjkmtmevihavtmjshgdmgyjekmxymzot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439179.8212337-2692-132551448556248/AnsiballZ_stat.py'
Jan 26 14:53:00 compute-1 sudo[159680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:00 compute-1 python3.9[159682]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:53:00 compute-1 sudo[159680]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:00 compute-1 sudo[159803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxrxnvikkgvgjcxzdcljmfpnxspbtxro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439179.8212337-2692-132551448556248/AnsiballZ_copy.py'
Jan 26 14:53:00 compute-1 sudo[159803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:01 compute-1 python3.9[159805]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439179.8212337-2692-132551448556248/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:01 compute-1 sudo[159803]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:01 compute-1 sudo[159957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uckswvjbvnciydouirpkndisqiwgaxxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439181.4567442-2722-246806746887368/AnsiballZ_systemd.py'
Jan 26 14:53:01 compute-1 sudo[159957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:02 compute-1 python3.9[159959]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:53:02 compute-1 systemd[1]: Reloading.
Jan 26 14:53:02 compute-1 systemd-rc-local-generator[160000]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:53:02 compute-1 systemd-sysv-generator[160005]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:53:02 compute-1 podman[159961]: 2026-01-26 14:53:02.198093746 +0000 UTC m=+0.078235106 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 26 14:53:02 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Jan 26 14:53:02 compute-1 sudo[159957]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:02 compute-1 podman[160014]: 2026-01-26 14:53:02.488122216 +0000 UTC m=+0.091894452 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 26 14:53:03 compute-1 sudo[160192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfqhfwrkuyslqdtmfssmhentwkdtyini ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439182.7434187-2739-196627503329238/AnsiballZ_systemd.py'
Jan 26 14:53:03 compute-1 sudo[160192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:03 compute-1 python3.9[160194]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 14:53:03 compute-1 systemd[1]: Reloading.
Jan 26 14:53:03 compute-1 systemd-rc-local-generator[160223]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:53:03 compute-1 systemd-sysv-generator[160228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:53:03 compute-1 systemd[1]: Reloading.
Jan 26 14:53:03 compute-1 systemd-sysv-generator[160258]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:53:03 compute-1 systemd-rc-local-generator[160255]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:53:03 compute-1 sudo[160192]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:04 compute-1 sshd-session[105483]: Connection closed by 192.168.122.30 port 42468
Jan 26 14:53:04 compute-1 sshd-session[105480]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:53:04 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Jan 26 14:53:04 compute-1 systemd[1]: session-24.scope: Consumed 3min 25.843s CPU time.
Jan 26 14:53:04 compute-1 systemd-logind[795]: Session 24 logged out. Waiting for processes to exit.
Jan 26 14:53:04 compute-1 systemd-logind[795]: Removed session 24.
Jan 26 14:53:10 compute-1 sshd-session[160292]: Accepted publickey for zuul from 192.168.122.30 port 39638 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:53:10 compute-1 systemd-logind[795]: New session 25 of user zuul.
Jan 26 14:53:10 compute-1 systemd[1]: Started Session 25 of User zuul.
Jan 26 14:53:10 compute-1 sshd-session[160292]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:53:11 compute-1 sshd-session[159830]: Invalid user nagios from 185.246.128.170 port 47306
Jan 26 14:53:11 compute-1 sshd-session[159830]: Disconnecting invalid user nagios 185.246.128.170 port 47306: Change of username or service not allowed: (nagios,ssh-connection) -> (router,ssh-connection) [preauth]
Jan 26 14:53:11 compute-1 python3.9[160445]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:53:12 compute-1 python3.9[160601]: ansible-ansible.builtin.service_facts Invoked
Jan 26 14:53:12 compute-1 network[160618]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 14:53:12 compute-1 network[160619]: 'network-scripts' will be removed from distribution in near future.
Jan 26 14:53:12 compute-1 network[160620]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 14:53:14 compute-1 sshd-session[160474]: Invalid user router from 185.246.128.170 port 18758
Jan 26 14:53:17 compute-1 sshd-session[160474]: Disconnecting invalid user router 185.246.128.170 port 18758: Change of username or service not allowed: (router,ssh-connection) -> (kafka,ssh-connection) [preauth]
Jan 26 14:53:18 compute-1 sudo[160889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvcfczcbqyeudrjrruupyswoyhltuktk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439197.7891636-70-214233482524526/AnsiballZ_setup.py'
Jan 26 14:53:18 compute-1 sudo[160889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:18 compute-1 python3.9[160891]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 14:53:18 compute-1 sudo[160889]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:19 compute-1 sudo[160974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sngqollscfgtgaffmhjigfictxeptzbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439197.7891636-70-214233482524526/AnsiballZ_dnf.py'
Jan 26 14:53:19 compute-1 sudo[160974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:19 compute-1 python3.9[160976]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:53:22 compute-1 sshd-session[160898]: Invalid user kafka from 185.246.128.170 port 50768
Jan 26 14:53:23 compute-1 sshd-session[160898]: Disconnecting invalid user kafka 185.246.128.170 port 50768: Change of username or service not allowed: (kafka,ssh-connection) -> (dev,ssh-connection) [preauth]
Jan 26 14:53:24 compute-1 sudo[160974]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:25 compute-1 sudo[161131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpxjeqxppvcfbkfimhkfoqollomxccqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439205.2270048-94-83037444865747/AnsiballZ_stat.py'
Jan 26 14:53:25 compute-1 sudo[161131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:25 compute-1 python3.9[161133]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:53:25 compute-1 sudo[161131]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:26 compute-1 sudo[161283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-webeluzsbpjzqufpkfpftlpzvovntcym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439206.1363838-115-270356939049505/AnsiballZ_command.py'
Jan 26 14:53:26 compute-1 sudo[161283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:26 compute-1 python3.9[161285]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:53:26 compute-1 sudo[161283]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:27 compute-1 sudo[161436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qshxjrlytfofqpcbodwfbpqsejqwlhdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439207.0463686-134-85607951026161/AnsiballZ_stat.py'
Jan 26 14:53:27 compute-1 sudo[161436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:27 compute-1 python3.9[161438]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:53:27 compute-1 sudo[161436]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:28 compute-1 sudo[161588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvunqrkuihluokrqmgokonzgnmtearhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439207.7343383-150-209500097586076/AnsiballZ_command.py'
Jan 26 14:53:28 compute-1 sudo[161588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:28 compute-1 python3.9[161590]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:53:28 compute-1 sudo[161588]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:28 compute-1 sudo[161741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osiadawnvyxqsbnganfnhlxdxplvusoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439208.4369628-166-96569730098485/AnsiballZ_stat.py'
Jan 26 14:53:28 compute-1 sudo[161741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:28 compute-1 python3.9[161743]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:53:28 compute-1 sudo[161741]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:53:28.998 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:53:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:53:28.999 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:53:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:53:28.999 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:53:29 compute-1 sudo[161865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbqhtyvralitocpnaipboftgifelmiqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439208.4369628-166-96569730098485/AnsiballZ_copy.py'
Jan 26 14:53:29 compute-1 sudo[161865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:29 compute-1 python3.9[161867]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439208.4369628-166-96569730098485/.source.iscsi _original_basename=.aup0s7mf follow=False checksum=edf25c35e2014cc40ad4d5891a9d34e3a6c7eb50 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:29 compute-1 sudo[161865]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:30 compute-1 sudo[162017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csqxiyoncszdlxoemcxbafppailwaefc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439210.3392098-196-225570095706375/AnsiballZ_file.py'
Jan 26 14:53:30 compute-1 sudo[162017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:30 compute-1 python3.9[162019]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:30 compute-1 sudo[162017]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:32 compute-1 sudo[162196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzxxwbzkxwtbuzqbsqgbfeemctscbxax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439211.1787336-212-143616194520304/AnsiballZ_lineinfile.py'
Jan 26 14:53:32 compute-1 podman[162144]: 2026-01-26 14:53:32.625770147 +0000 UTC m=+0.055047527 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 14:53:32 compute-1 sudo[162196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:32 compute-1 podman[162143]: 2026-01-26 14:53:32.653127271 +0000 UTC m=+0.091865672 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 14:53:32 compute-1 python3.9[162211]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:32 compute-1 sudo[162196]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:32 compute-1 sshd-session[161004]: Invalid user dev from 185.246.128.170 port 61494
Jan 26 14:53:33 compute-1 sudo[162365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qviccuxkggwrryhggjxukecgnywvomnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439213.0993376-230-25140703544803/AnsiballZ_systemd_service.py'
Jan 26 14:53:33 compute-1 sudo[162365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:33 compute-1 python3.9[162367]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:53:33 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 26 14:53:34 compute-1 sudo[162365]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:34 compute-1 sudo[162521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijgmazbqcrsehowmltbhjyddazehhubn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439214.2032943-246-85864994931176/AnsiballZ_systemd_service.py'
Jan 26 14:53:34 compute-1 sudo[162521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:34 compute-1 python3.9[162523]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:53:34 compute-1 systemd[1]: Reloading.
Jan 26 14:53:35 compute-1 systemd-rc-local-generator[162553]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:53:35 compute-1 systemd-sysv-generator[162556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:53:35 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 14:53:35 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 26 14:53:35 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Jan 26 14:53:35 compute-1 systemd[1]: Started Open-iSCSI.
Jan 26 14:53:35 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 26 14:53:35 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 26 14:53:35 compute-1 sudo[162521]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:36 compute-1 python3.9[162722]: ansible-ansible.builtin.service_facts Invoked
Jan 26 14:53:36 compute-1 network[162739]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 14:53:36 compute-1 network[162740]: 'network-scripts' will be removed from distribution in near future.
Jan 26 14:53:36 compute-1 network[162741]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 14:53:36 compute-1 sshd-session[161004]: Disconnecting invalid user dev 185.246.128.170 port 61494: Change of username or service not allowed: (dev,ssh-connection) -> (123,ssh-connection) [preauth]
Jan 26 14:53:41 compute-1 sudo[163010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayqojseaztvinolfozuvvfnrfnkjnacy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439221.5603123-292-182894503954630/AnsiballZ_dnf.py'
Jan 26 14:53:41 compute-1 sudo[163010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:42 compute-1 python3.9[163012]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:53:44 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 14:53:44 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 14:53:44 compute-1 systemd[1]: Reloading.
Jan 26 14:53:44 compute-1 systemd-rc-local-generator[163057]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:53:44 compute-1 systemd-sysv-generator[163061]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:53:44 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 14:53:45 compute-1 sudo[163010]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:45 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 14:53:45 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 14:53:45 compute-1 systemd[1]: run-r9c1acea545204f048078b37f60f9339d.service: Deactivated successfully.
Jan 26 14:53:46 compute-1 sudo[163326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npvwcuveotkznrsysfxqswsedrqzqeqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439225.9235635-310-218252251011025/AnsiballZ_file.py'
Jan 26 14:53:46 compute-1 sudo[163326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:46 compute-1 python3.9[163328]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 14:53:46 compute-1 sudo[163326]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:46 compute-1 sudo[163480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnveozvqwcpymgzhszjptmxkrstpqxoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439226.5781722-326-228088660168907/AnsiballZ_modprobe.py'
Jan 26 14:53:46 compute-1 sudo[163480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:47 compute-1 python3.9[163482]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 26 14:53:47 compute-1 sudo[163480]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:47 compute-1 sudo[163636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjjxlpbggbdyeqwxthyqqmqzqjlyihyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439227.3657265-342-65033869115157/AnsiballZ_stat.py'
Jan 26 14:53:47 compute-1 sudo[163636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:47 compute-1 python3.9[163638]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:53:47 compute-1 sudo[163636]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:48 compute-1 sudo[163759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umucdjgizuzlzjchimjirvmfnxxmybfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439227.3657265-342-65033869115157/AnsiballZ_copy.py'
Jan 26 14:53:48 compute-1 sudo[163759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:48 compute-1 python3.9[163761]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439227.3657265-342-65033869115157/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:48 compute-1 sudo[163759]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:49 compute-1 sudo[163912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sszusaerfqovamccdcjtmrcqfkfhiwqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439228.7783244-374-16925600707805/AnsiballZ_lineinfile.py'
Jan 26 14:53:49 compute-1 sudo[163912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:49 compute-1 python3.9[163914]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:49 compute-1 sudo[163912]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:50 compute-1 sudo[164065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhqzmfizvdkirjtvnvyqgeapebnpwfka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439229.4667983-390-267722398608270/AnsiballZ_systemd.py'
Jan 26 14:53:50 compute-1 sudo[164065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:50 compute-1 python3.9[164067]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:53:50 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 14:53:50 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 26 14:53:50 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 26 14:53:50 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 26 14:53:50 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 26 14:53:50 compute-1 sudo[164065]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:50 compute-1 sudo[164221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avtmmlqtvwifezfykvhidjlcsoygosls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439230.6742911-406-196613964770643/AnsiballZ_command.py'
Jan 26 14:53:50 compute-1 sudo[164221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:51 compute-1 python3.9[164223]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:53:51 compute-1 sudo[164221]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:51 compute-1 sudo[164374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhunisejglgvqrkmfbwskfdadqgovupv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439231.4837086-426-195339920968740/AnsiballZ_stat.py'
Jan 26 14:53:51 compute-1 sudo[164374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:51 compute-1 sshd-session[163885]: Connection reset by authenticating user root 176.120.22.13 port 50468 [preauth]
Jan 26 14:53:51 compute-1 python3.9[164376]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:53:51 compute-1 sudo[164374]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:52 compute-1 sshd-session[163329]: Invalid user 123 from 185.246.128.170 port 18440
Jan 26 14:53:52 compute-1 sshd-session[163329]: Disconnecting invalid user 123 185.246.128.170 port 18440: Change of username or service not allowed: (123,ssh-connection) -> (stack,ssh-connection) [preauth]
Jan 26 14:53:52 compute-1 sudo[164528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrewmswvcorzsmtiymqathnmunwyhiwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439232.3571513-444-48417884808298/AnsiballZ_stat.py'
Jan 26 14:53:52 compute-1 sudo[164528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:52 compute-1 python3.9[164530]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:53:52 compute-1 sudo[164528]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:53 compute-1 sudo[164651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxxxlopkvvfubkiooxkxebhctmeqgwcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439232.3571513-444-48417884808298/AnsiballZ_copy.py'
Jan 26 14:53:53 compute-1 sudo[164651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:53 compute-1 python3.9[164653]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439232.3571513-444-48417884808298/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:53 compute-1 sudo[164651]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:53 compute-1 sudo[164803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkzluygsmontmjmujppimqxmkbapknuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439233.6092663-474-179640405864800/AnsiballZ_command.py'
Jan 26 14:53:53 compute-1 sudo[164803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:54 compute-1 sshd-session[164377]: Invalid user vpn from 176.120.22.13 port 54198
Jan 26 14:53:54 compute-1 python3.9[164805]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:53:54 compute-1 sudo[164803]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:54 compute-1 sshd-session[164377]: Connection reset by invalid user vpn 176.120.22.13 port 54198 [preauth]
Jan 26 14:53:54 compute-1 sudo[164957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rblzqxydojtafurkfmsdxbuextltedhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439234.3687043-490-27290864522668/AnsiballZ_lineinfile.py'
Jan 26 14:53:54 compute-1 sudo[164957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:54 compute-1 python3.9[164959]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:54 compute-1 sudo[164957]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:55 compute-1 sudo[165110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rppqyisdsgzxxjnaqfmpmopcqfjiocgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439235.2118216-506-182097412397556/AnsiballZ_replace.py'
Jan 26 14:53:55 compute-1 sudo[165110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:55 compute-1 python3.9[165112]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:55 compute-1 sudo[165110]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:55 compute-1 sshd-session[164906]: Invalid user ubnt from 176.120.22.13 port 54200
Jan 26 14:53:56 compute-1 sshd-session[164906]: Connection reset by invalid user ubnt 176.120.22.13 port 54200 [preauth]
Jan 26 14:53:56 compute-1 sudo[165262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alddydrsbxpaehqktkxwpyspamzvduvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439236.0915895-522-117151262014769/AnsiballZ_replace.py'
Jan 26 14:53:56 compute-1 sudo[165262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:56 compute-1 python3.9[165264]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:56 compute-1 sudo[165262]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:57 compute-1 sudo[165416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uflbvrcqsfeyguqcahmmdptiglmuwxin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439236.891692-540-45820966398346/AnsiballZ_lineinfile.py'
Jan 26 14:53:57 compute-1 sudo[165416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:57 compute-1 python3.9[165418]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:57 compute-1 sudo[165416]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:57 compute-1 sudo[165568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wowhvxdpdgrwbfcdftpemmjifripfams ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439237.6644351-540-52412299317760/AnsiballZ_lineinfile.py'
Jan 26 14:53:57 compute-1 sudo[165568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:58 compute-1 python3.9[165570]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:58 compute-1 sudo[165568]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:58 compute-1 sudo[165720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crytqtwzomllfyfsmgrkvggticgfrnom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439238.325355-540-89509427293236/AnsiballZ_lineinfile.py'
Jan 26 14:53:58 compute-1 sudo[165720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:58 compute-1 sshd-session[165265]: Connection reset by authenticating user root 176.120.22.13 port 54220 [preauth]
Jan 26 14:53:58 compute-1 python3.9[165722]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:58 compute-1 sudo[165720]: pam_unix(sudo:session): session closed for user root
Jan 26 14:53:59 compute-1 sudo[165873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljfpsrbywlhsaykwyuasnokrpnzgmjjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439239.0013604-540-6068922409458/AnsiballZ_lineinfile.py'
Jan 26 14:53:59 compute-1 sudo[165873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:53:59 compute-1 python3.9[165875]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:53:59 compute-1 sudo[165873]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:00 compute-1 sudo[166026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpbqnbajxygwwecgklpilbrguysvaaoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439239.7024279-598-164427609801973/AnsiballZ_stat.py'
Jan 26 14:54:00 compute-1 sudo[166026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:00 compute-1 python3.9[166028]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:54:00 compute-1 sudo[166026]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:00 compute-1 sudo[166181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmrqrahjspkbikquadzdaqgexpannemj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439240.417445-614-36179041080092/AnsiballZ_command.py'
Jan 26 14:54:00 compute-1 sudo[166181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:00 compute-1 python3.9[166183]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:00 compute-1 sudo[166181]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:01 compute-1 sudo[166334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhxosiufmrvjlqldeazfdbfeozqjmnra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439241.205549-632-227894416614014/AnsiballZ_systemd_service.py'
Jan 26 14:54:01 compute-1 sudo[166334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:01 compute-1 sshd-session[165746]: Connection reset by authenticating user root 176.120.22.13 port 54224 [preauth]
Jan 26 14:54:01 compute-1 python3.9[166336]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:01 compute-1 systemd[1]: Listening on multipathd control socket.
Jan 26 14:54:01 compute-1 sudo[166334]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:02 compute-1 sudo[166490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydiefqpzwqteosaoxafpowquxnwsafwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439242.1096537-648-78582813700121/AnsiballZ_systemd_service.py'
Jan 26 14:54:02 compute-1 sudo[166490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:02 compute-1 python3.9[166492]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:02 compute-1 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 26 14:54:02 compute-1 udevadm[166517]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 26 14:54:02 compute-1 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 26 14:54:02 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 14:54:02 compute-1 podman[166495]: 2026-01-26 14:54:02.844445464 +0000 UTC m=+0.101953730 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 14:54:02 compute-1 podman[166494]: 2026-01-26 14:54:02.853117569 +0000 UTC m=+0.110393299 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 26 14:54:02 compute-1 multipathd[166545]: --------start up--------
Jan 26 14:54:02 compute-1 multipathd[166545]: read /etc/multipath.conf
Jan 26 14:54:02 compute-1 multipathd[166545]: path checkers start up
Jan 26 14:54:02 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 14:54:02 compute-1 sudo[166490]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:03 compute-1 sudo[166702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihswgtqbbpbewmipmncfztxskfllxpab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439243.5087056-672-41736735774293/AnsiballZ_file.py'
Jan 26 14:54:03 compute-1 sudo[166702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:04 compute-1 python3.9[166704]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 14:54:04 compute-1 sudo[166702]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:04 compute-1 sudo[166855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axhtvnnubrfibmsamfbtdjtsmmufhknr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439244.3069575-688-154541432317560/AnsiballZ_modprobe.py'
Jan 26 14:54:04 compute-1 sudo[166855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:04 compute-1 python3.9[166857]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 26 14:54:04 compute-1 kernel: Key type psk registered
Jan 26 14:54:04 compute-1 sudo[166855]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:05 compute-1 sudo[167018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfnjqfzymezcgiphpdfxdzvxcihtrvbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439245.004516-704-28561443038526/AnsiballZ_stat.py'
Jan 26 14:54:05 compute-1 sudo[167018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:05 compute-1 python3.9[167020]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:54:05 compute-1 sudo[167018]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:05 compute-1 sudo[167141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwlplkxhatitedeufcrwgdakqxohmtyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439245.004516-704-28561443038526/AnsiballZ_copy.py'
Jan 26 14:54:05 compute-1 sudo[167141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:06 compute-1 python3.9[167143]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439245.004516-704-28561443038526/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:06 compute-1 sudo[167141]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:06 compute-1 sudo[167293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-artrymsqlssytsfenrwoxpoackmdvudw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439246.4050357-736-247929778701141/AnsiballZ_lineinfile.py'
Jan 26 14:54:06 compute-1 sudo[167293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:06 compute-1 python3.9[167295]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:06 compute-1 sudo[167293]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:07 compute-1 sudo[167445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwjnfavkjvlfqruewqaevnpsqjjzhwvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439247.1156287-752-186569025515525/AnsiballZ_systemd.py'
Jan 26 14:54:07 compute-1 sudo[167445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:07 compute-1 python3.9[167447]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:54:07 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 14:54:07 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 26 14:54:07 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 26 14:54:07 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 26 14:54:07 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 26 14:54:07 compute-1 sudo[167445]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:08 compute-1 sshd-session[166052]: Invalid user stack from 185.246.128.170 port 42389
Jan 26 14:54:08 compute-1 sshd-session[166052]: Disconnecting invalid user stack 185.246.128.170 port 42389: Change of username or service not allowed: (stack,ssh-connection) -> (note,ssh-connection) [preauth]
Jan 26 14:54:08 compute-1 sudo[167601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjvgeedzckzzcxlbssopxzijlfkaicog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439248.263989-768-252433800921191/AnsiballZ_dnf.py'
Jan 26 14:54:08 compute-1 sudo[167601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:08 compute-1 python3.9[167603]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 14:54:11 compute-1 systemd[1]: Reloading.
Jan 26 14:54:11 compute-1 systemd-sysv-generator[167638]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:54:11 compute-1 systemd-rc-local-generator[167635]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:54:11 compute-1 systemd[1]: Reloading.
Jan 26 14:54:12 compute-1 systemd-sysv-generator[167675]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:54:12 compute-1 systemd-rc-local-generator[167671]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:54:12 compute-1 systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 14:54:12 compute-1 systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 14:54:12 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 14:54:12 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 26 14:54:12 compute-1 systemd[1]: Reloading.
Jan 26 14:54:12 compute-1 systemd-rc-local-generator[167768]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:54:12 compute-1 systemd-sysv-generator[167771]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:54:13 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 14:54:13 compute-1 sudo[167601]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:14 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 14:54:14 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 26 14:54:14 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.506s CPU time.
Jan 26 14:54:14 compute-1 systemd[1]: run-raa06f621d9a5420981e315af47df1585.service: Deactivated successfully.
Jan 26 14:54:14 compute-1 sudo[169067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrdutzdjdivuaapzjjzxdybdjtpcnejb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439254.0261984-784-88828512206931/AnsiballZ_systemd_service.py'
Jan 26 14:54:14 compute-1 sudo[169067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:14 compute-1 sshd-session[167723]: Invalid user note from 185.246.128.170 port 35755
Jan 26 14:54:14 compute-1 python3.9[169069]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:54:14 compute-1 systemd[1]: Stopping Open-iSCSI...
Jan 26 14:54:14 compute-1 iscsid[162563]: iscsid shutting down.
Jan 26 14:54:14 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Jan 26 14:54:14 compute-1 systemd[1]: Stopped Open-iSCSI.
Jan 26 14:54:14 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 14:54:14 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 26 14:54:14 compute-1 systemd[1]: Started Open-iSCSI.
Jan 26 14:54:14 compute-1 sudo[169067]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:15 compute-1 sudo[169223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tehoxagrbdetocbglmdcobjqtdpjnpfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439254.9838061-800-138991465683180/AnsiballZ_systemd_service.py'
Jan 26 14:54:15 compute-1 sudo[169223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:15 compute-1 python3.9[169225]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:54:15 compute-1 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 26 14:54:15 compute-1 multipathd[166545]: exit (signal)
Jan 26 14:54:15 compute-1 multipathd[166545]: --------shut down-------
Jan 26 14:54:15 compute-1 systemd[1]: multipathd.service: Deactivated successfully.
Jan 26 14:54:15 compute-1 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 26 14:54:15 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 14:54:15 compute-1 multipathd[169231]: --------start up--------
Jan 26 14:54:15 compute-1 multipathd[169231]: read /etc/multipath.conf
Jan 26 14:54:15 compute-1 multipathd[169231]: path checkers start up
Jan 26 14:54:15 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 14:54:15 compute-1 sudo[169223]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:16 compute-1 python3.9[169388]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:54:16 compute-1 sshd-session[167723]: Disconnecting invalid user note 185.246.128.170 port 35755: Change of username or service not allowed: (note,ssh-connection) -> (test,ssh-connection) [preauth]
Jan 26 14:54:17 compute-1 sudo[169542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaxqwiysvyqoegvogsqjttonvfjygzdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439257.1349237-835-116618359292745/AnsiballZ_file.py'
Jan 26 14:54:17 compute-1 sudo[169542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:17 compute-1 python3.9[169544]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:17 compute-1 sudo[169542]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:18 compute-1 sudo[169694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pltialouudwacghapkrjkbhrzxcodiih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439258.167166-857-147533562389201/AnsiballZ_systemd_service.py'
Jan 26 14:54:18 compute-1 sudo[169694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:18 compute-1 python3.9[169696]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:54:18 compute-1 systemd[1]: Reloading.
Jan 26 14:54:18 compute-1 systemd-rc-local-generator[169717]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:54:18 compute-1 systemd-sysv-generator[169723]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:54:19 compute-1 sudo[169694]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:19 compute-1 python3.9[169880]: ansible-ansible.builtin.service_facts Invoked
Jan 26 14:54:19 compute-1 network[169897]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 14:54:19 compute-1 network[169898]: 'network-scripts' will be removed from distribution in near future.
Jan 26 14:54:19 compute-1 network[169899]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 14:54:26 compute-1 sudo[170169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvzkhdrlqepsbcmttnmnyjzrppbsxngh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439266.372184-895-139339510148045/AnsiballZ_systemd_service.py'
Jan 26 14:54:26 compute-1 sudo[170169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:27 compute-1 python3.9[170171]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:27 compute-1 sudo[170169]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:27 compute-1 sudo[170322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrntwdsolirylpwelplgxguvlejdvbmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439267.2122886-895-229976036837648/AnsiballZ_systemd_service.py'
Jan 26 14:54:27 compute-1 sudo[170322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:27 compute-1 python3.9[170324]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:27 compute-1 sudo[170322]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:28 compute-1 sudo[170475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxvqaukajladdhynmobwaspluglbrmmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439267.9976141-895-229900969360269/AnsiballZ_systemd_service.py'
Jan 26 14:54:28 compute-1 sudo[170475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:28 compute-1 python3.9[170477]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:28 compute-1 sudo[170475]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:54:29.001 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:54:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:54:29.002 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:54:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:54:29.002 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:54:29 compute-1 sudo[170630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uryosctqtwvynfybnaiilirelkheygqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439268.7578316-895-80165404469942/AnsiballZ_systemd_service.py'
Jan 26 14:54:29 compute-1 sudo[170630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:29 compute-1 python3.9[170632]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:29 compute-1 sudo[170630]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:29 compute-1 sudo[170783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oivqkxkvkgudwencctrxmrcreasosvfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439269.586414-895-79132887869341/AnsiballZ_systemd_service.py'
Jan 26 14:54:29 compute-1 sudo[170783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:30 compute-1 python3.9[170785]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:30 compute-1 sudo[170783]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:30 compute-1 sudo[170937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmvnlwnpzghlrtbimemgdnhjslbhxwej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439270.4184887-895-77702502265781/AnsiballZ_systemd_service.py'
Jan 26 14:54:30 compute-1 sudo[170937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:31 compute-1 python3.9[170939]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:31 compute-1 sudo[170937]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:31 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 26 14:54:31 compute-1 sudo[171091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mstnuqvagehndlbksdoqzrtppbkpxuuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439271.2862332-895-142548944374804/AnsiballZ_systemd_service.py'
Jan 26 14:54:31 compute-1 sudo[171091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:31 compute-1 python3.9[171093]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:31 compute-1 sudo[171091]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:32 compute-1 sudo[171244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpnmayelvulkutpvtrrdxxavkyzjkarr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439272.0494783-895-221477947893296/AnsiballZ_systemd_service.py'
Jan 26 14:54:32 compute-1 sudo[171244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:32 compute-1 sshd-session[170478]: Invalid user test from 185.246.128.170 port 26018
Jan 26 14:54:32 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 14:54:32 compute-1 python3.9[171246]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:54:32 compute-1 sudo[171244]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:33 compute-1 sudo[171415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejdetctwvlcrwcsaiomnhhnzjwlydclt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439273.4537408-1013-215874995917031/AnsiballZ_file.py'
Jan 26 14:54:33 compute-1 sudo[171415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:33 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 26 14:54:33 compute-1 podman[171374]: 2026-01-26 14:54:33.776936314 +0000 UTC m=+0.067538918 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 14:54:33 compute-1 podman[171373]: 2026-01-26 14:54:33.821416828 +0000 UTC m=+0.117507881 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 14:54:33 compute-1 python3.9[171427]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:33 compute-1 sudo[171415]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:34 compute-1 sudo[171598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeqacklpitvcedlelxohsujvmbkapibw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439274.0943828-1013-138740130116937/AnsiballZ_file.py'
Jan 26 14:54:34 compute-1 sudo[171598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:34 compute-1 python3.9[171600]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:34 compute-1 sudo[171598]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:34 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 14:54:35 compute-1 sudo[171751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpxvljdohgaaxlsfdhhnqegdyklrbqll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439274.7558873-1013-96198805573586/AnsiballZ_file.py'
Jan 26 14:54:35 compute-1 sudo[171751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:35 compute-1 python3.9[171753]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:35 compute-1 sudo[171751]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:35 compute-1 sudo[171903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbhxmvzvckmlxmlzarlykvqvkthjxvpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439275.4287763-1013-208320408436102/AnsiballZ_file.py'
Jan 26 14:54:35 compute-1 sudo[171903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:35 compute-1 python3.9[171905]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:35 compute-1 sudo[171903]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:36 compute-1 sudo[172055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djwoosozqekczqxwuiqnlpqinpkstuez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439276.0626788-1013-22675675260750/AnsiballZ_file.py'
Jan 26 14:54:36 compute-1 sudo[172055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:36 compute-1 python3.9[172057]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:36 compute-1 sudo[172055]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:37 compute-1 sudo[172207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxdtijemdffcboqipgqrljadqfjqwndp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439276.7319226-1013-264098595561340/AnsiballZ_file.py'
Jan 26 14:54:37 compute-1 sudo[172207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:37 compute-1 python3.9[172209]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:37 compute-1 sudo[172207]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:37 compute-1 sudo[172359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rylcjzpuptwsajdmbopbyudccpsnpssw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439277.3727555-1013-213775382707326/AnsiballZ_file.py'
Jan 26 14:54:37 compute-1 sudo[172359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:37 compute-1 python3.9[172361]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:37 compute-1 sudo[172359]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:38 compute-1 sudo[172511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-totwfpdtzybfgeibjqajhxqrykmecwcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439278.1022666-1013-112570113310277/AnsiballZ_file.py'
Jan 26 14:54:38 compute-1 sudo[172511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:38 compute-1 python3.9[172513]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:38 compute-1 sudo[172511]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:39 compute-1 sudo[172663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjdolluykckltatbxiwezkyawyvyqhtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439278.781173-1127-254820090857339/AnsiballZ_file.py'
Jan 26 14:54:39 compute-1 sudo[172663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:39 compute-1 python3.9[172665]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:39 compute-1 sudo[172663]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:39 compute-1 sudo[172815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbmtukllaxjqkdmonjpaepjwarjdephd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439279.4572275-1127-257849559765840/AnsiballZ_file.py'
Jan 26 14:54:39 compute-1 sudo[172815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:39 compute-1 python3.9[172817]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:39 compute-1 sudo[172815]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:40 compute-1 sudo[172967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mimpyoehzfwxwhqvyzpbhuausffwlzya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439280.0631707-1127-69699894517040/AnsiballZ_file.py'
Jan 26 14:54:40 compute-1 sudo[172967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:40 compute-1 python3.9[172969]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:40 compute-1 sudo[172967]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:41 compute-1 sshd-session[170478]: error: maximum authentication attempts exceeded for invalid user test from 185.246.128.170 port 26018 ssh2 [preauth]
Jan 26 14:54:41 compute-1 sshd-session[170478]: Disconnecting invalid user test 185.246.128.170 port 26018: Too many authentication failures [preauth]
Jan 26 14:54:41 compute-1 sudo[173119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcldtmlhfshuxobhxycsdfcjiydpvmme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439280.8716445-1127-122070218507518/AnsiballZ_file.py'
Jan 26 14:54:41 compute-1 sudo[173119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:41 compute-1 python3.9[173121]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:41 compute-1 sudo[173119]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:41 compute-1 sudo[173271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqktzuwtwcwlntlfqlitinjzeiuwhacx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439281.5090275-1127-6168594148574/AnsiballZ_file.py'
Jan 26 14:54:41 compute-1 sudo[173271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:41 compute-1 python3.9[173273]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:42 compute-1 sudo[173271]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:42 compute-1 sudo[173424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctoftqqbzggrgrcyyjxrqmzgfsarnwxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439282.177764-1127-145535644966232/AnsiballZ_file.py'
Jan 26 14:54:42 compute-1 sudo[173424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:42 compute-1 python3.9[173426]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:42 compute-1 sudo[173424]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:43 compute-1 sudo[173576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vstwbbewfjrxcdrxzjcbwpluctsmctwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439282.799962-1127-123352925341833/AnsiballZ_file.py'
Jan 26 14:54:43 compute-1 sudo[173576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:43 compute-1 python3.9[173578]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:43 compute-1 sudo[173576]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:43 compute-1 sudo[173729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmmnwfecjbtmclymgorrojzqvepolmex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439283.433446-1127-68951932343670/AnsiballZ_file.py'
Jan 26 14:54:43 compute-1 sudo[173729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:44 compute-1 python3.9[173731]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:54:44 compute-1 sudo[173729]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:44 compute-1 sudo[173881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfqepptlxkfhapnhtsnszfawesqlzbtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439284.5978055-1243-135486569648918/AnsiballZ_command.py'
Jan 26 14:54:44 compute-1 sudo[173881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:45 compute-1 python3.9[173883]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:45 compute-1 sudo[173881]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:46 compute-1 python3.9[174035]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 14:54:46 compute-1 sudo[174185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpxxljaiugohkbsnmuzbmmixsrganhvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439286.4215336-1279-186576864470206/AnsiballZ_systemd_service.py'
Jan 26 14:54:46 compute-1 sudo[174185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:47 compute-1 python3.9[174187]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:54:47 compute-1 systemd[1]: Reloading.
Jan 26 14:54:47 compute-1 systemd-rc-local-generator[174211]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:54:47 compute-1 systemd-sysv-generator[174217]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:54:47 compute-1 sudo[174185]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:47 compute-1 sudo[174374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilzgqfzubygbsissbwdenydswrpqrzbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439287.631194-1295-125126682551542/AnsiballZ_command.py'
Jan 26 14:54:47 compute-1 sudo[174374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:48 compute-1 python3.9[174376]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:48 compute-1 sudo[174374]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:48 compute-1 sudo[174527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoobndmqehfvnvynfgntsdxlunlkfumf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439288.3636687-1295-188752539983604/AnsiballZ_command.py'
Jan 26 14:54:48 compute-1 sudo[174527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:48 compute-1 python3.9[174529]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:48 compute-1 sudo[174527]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:49 compute-1 sudo[174680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fomyenvhimzadfkerbrnkcqrsbnyrjmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439289.0347466-1295-11688692709218/AnsiballZ_command.py'
Jan 26 14:54:49 compute-1 sudo[174680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:49 compute-1 python3.9[174682]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:49 compute-1 sudo[174680]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:50 compute-1 sudo[174833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrksvzntagcdyuvkyjqaaajmfvbesgcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439289.6665242-1295-83488879331591/AnsiballZ_command.py'
Jan 26 14:54:50 compute-1 sudo[174833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:50 compute-1 python3.9[174835]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:50 compute-1 sudo[174833]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:50 compute-1 sudo[174986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwjtvharjkuhxijoafwvgbgqxbqvicou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439290.5097582-1295-55994258787960/AnsiballZ_command.py'
Jan 26 14:54:50 compute-1 sudo[174986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:50 compute-1 python3.9[174988]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:51 compute-1 sudo[174986]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:51 compute-1 sudo[175139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnnxztvxldlougixtrrnvugqfrgliclt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439291.1471574-1295-72591373913568/AnsiballZ_command.py'
Jan 26 14:54:51 compute-1 sudo[175139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:51 compute-1 python3.9[175141]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:51 compute-1 sudo[175139]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:52 compute-1 sudo[175292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otgmxaswulgwuacfjexvaqeynyjakgkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439291.8129017-1295-54420784000523/AnsiballZ_command.py'
Jan 26 14:54:52 compute-1 sudo[175292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:52 compute-1 python3.9[175294]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:52 compute-1 sudo[175292]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:52 compute-1 sshd-session[173297]: Invalid user test from 185.246.128.170 port 59457
Jan 26 14:54:52 compute-1 sudo[175445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwrmdaoivjxjfehomkirtdaydscrecul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439292.5567338-1295-143848805535977/AnsiballZ_command.py'
Jan 26 14:54:52 compute-1 sudo[175445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:53 compute-1 python3.9[175447]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:54:53 compute-1 sudo[175445]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:55 compute-1 sshd-session[173297]: Disconnecting invalid user test 185.246.128.170 port 59457: Change of username or service not allowed: (test,ssh-connection) -> (ahmed,ssh-connection) [preauth]
Jan 26 14:54:55 compute-1 sudo[175598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihlqvhmytkzhronwpenlefykqbdcwsqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439295.052636-1438-216068678598234/AnsiballZ_file.py'
Jan 26 14:54:55 compute-1 sudo[175598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:55 compute-1 python3.9[175600]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:54:55 compute-1 sudo[175598]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:56 compute-1 sudo[175750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbgejksmyillsfgzvpyzfmugakyckwgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439295.7959967-1438-239222712645687/AnsiballZ_file.py'
Jan 26 14:54:56 compute-1 sudo[175750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:56 compute-1 python3.9[175752]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:54:56 compute-1 sudo[175750]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:56 compute-1 sudo[175902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqikgggysepviqaqhzpgnifepqsrvzyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439296.4905584-1438-148816960195472/AnsiballZ_file.py'
Jan 26 14:54:56 compute-1 sudo[175902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:56 compute-1 python3.9[175904]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:54:57 compute-1 sudo[175902]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:57 compute-1 sudo[176054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caafukhxpmoidylbdylwonbllamsnulw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439297.2404618-1483-118865544343790/AnsiballZ_file.py'
Jan 26 14:54:57 compute-1 sudo[176054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:57 compute-1 python3.9[176056]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:54:57 compute-1 sudo[176054]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:58 compute-1 sudo[176206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giuklyouqgvlsrzhtajgajayfkdpalwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439297.8485732-1483-193067364604622/AnsiballZ_file.py'
Jan 26 14:54:58 compute-1 sudo[176206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:58 compute-1 python3.9[176208]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:54:58 compute-1 sudo[176206]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:58 compute-1 sudo[176358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biylnwlihdymhdizjzpjcfxtgjnkicux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439298.5462556-1483-166518020827255/AnsiballZ_file.py'
Jan 26 14:54:58 compute-1 sudo[176358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:59 compute-1 python3.9[176360]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:54:59 compute-1 sudo[176358]: pam_unix(sudo:session): session closed for user root
Jan 26 14:54:59 compute-1 sudo[176510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpuwddnjkyemkmtkkkfpuiifqldrjacz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439299.1613662-1483-250286894319201/AnsiballZ_file.py'
Jan 26 14:54:59 compute-1 sudo[176510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:54:59 compute-1 python3.9[176512]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:54:59 compute-1 sudo[176510]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:00 compute-1 sudo[176662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqqtwywktueriscdjtjbxsqzqcgckthj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439299.7726655-1483-204914073782085/AnsiballZ_file.py'
Jan 26 14:55:00 compute-1 sudo[176662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:00 compute-1 python3.9[176664]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:00 compute-1 sudo[176662]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:00 compute-1 sudo[176815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izzmzqeckywjxmgfutefgchqqfhtyjuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439300.389405-1483-158712967639645/AnsiballZ_file.py'
Jan 26 14:55:00 compute-1 sudo[176815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:00 compute-1 python3.9[176817]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:00 compute-1 sudo[176815]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:01 compute-1 sudo[176967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqufhxblmcsrzxzngxppnvuxcuxirhje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439301.1163764-1483-9151109409867/AnsiballZ_file.py'
Jan 26 14:55:01 compute-1 sudo[176967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:01 compute-1 python3.9[176969]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:01 compute-1 sudo[176967]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:03 compute-1 podman[176995]: 2026-01-26 14:55:03.909077034 +0000 UTC m=+0.088099455 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 14:55:04 compute-1 podman[177015]: 2026-01-26 14:55:04.000082156 +0000 UTC m=+0.092458202 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 14:55:06 compute-1 sshd-session[176665]: Invalid user ahmed from 185.246.128.170 port 48678
Jan 26 14:55:06 compute-1 sudo[177166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpgakxorfjvxfzdswnxghtswyetkxttu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439306.262119-1719-256010817459637/AnsiballZ_getent.py'
Jan 26 14:55:06 compute-1 sudo[177166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:06 compute-1 python3.9[177168]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 26 14:55:06 compute-1 sudo[177166]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:07 compute-1 sudo[177319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clvdkpkgzklyyadglvojulegtkajzxmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439307.1444213-1735-232434959424816/AnsiballZ_group.py'
Jan 26 14:55:07 compute-1 sudo[177319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:07 compute-1 sshd-session[176665]: Disconnecting invalid user ahmed 185.246.128.170 port 48678: Change of username or service not allowed: (ahmed,ssh-connection) -> (support,ssh-connection) [preauth]
Jan 26 14:55:07 compute-1 python3.9[177321]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 14:55:08 compute-1 groupadd[177322]: group added to /etc/group: name=nova, GID=42436
Jan 26 14:55:08 compute-1 groupadd[177322]: group added to /etc/gshadow: name=nova
Jan 26 14:55:08 compute-1 groupadd[177322]: new group: name=nova, GID=42436
Jan 26 14:55:08 compute-1 sudo[177319]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:09 compute-1 sudo[177477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjlhilyiseiikevhixsvaunafptovdth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439309.0697293-1751-274463719177283/AnsiballZ_user.py'
Jan 26 14:55:09 compute-1 sudo[177477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:09 compute-1 python3.9[177479]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 14:55:10 compute-1 useradd[177481]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 26 14:55:10 compute-1 useradd[177481]: add 'nova' to group 'libvirt'
Jan 26 14:55:10 compute-1 useradd[177481]: add 'nova' to shadow group 'libvirt'
Jan 26 14:55:10 compute-1 sudo[177477]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:11 compute-1 sshd-session[177512]: Accepted publickey for zuul from 192.168.122.30 port 36106 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:55:11 compute-1 systemd-logind[795]: New session 26 of user zuul.
Jan 26 14:55:11 compute-1 systemd[1]: Started Session 26 of User zuul.
Jan 26 14:55:11 compute-1 sshd-session[177512]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:55:12 compute-1 sshd-session[177515]: Received disconnect from 192.168.122.30 port 36106:11: disconnected by user
Jan 26 14:55:12 compute-1 sshd-session[177515]: Disconnected from user zuul 192.168.122.30 port 36106
Jan 26 14:55:12 compute-1 sshd-session[177512]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:55:12 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Jan 26 14:55:12 compute-1 systemd-logind[795]: Session 26 logged out. Waiting for processes to exit.
Jan 26 14:55:12 compute-1 systemd-logind[795]: Removed session 26.
Jan 26 14:55:12 compute-1 python3.9[177665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:55:13 compute-1 python3.9[177787]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439312.3338337-1801-177391736309843/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:14 compute-1 python3.9[177937]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:55:14 compute-1 python3.9[178013]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:15 compute-1 python3.9[178164]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:55:15 compute-1 python3.9[178285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439314.767629-1801-80782878516815/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:16 compute-1 python3.9[178435]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:55:16 compute-1 sshd-session[177714]: Invalid user support from 185.246.128.170 port 59632
Jan 26 14:55:16 compute-1 python3.9[178556]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439315.84185-1801-34511927557621/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:17 compute-1 python3.9[178706]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:55:18 compute-1 python3.9[178827]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439317.004281-1801-79690315176976/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:18 compute-1 python3.9[178977]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:55:19 compute-1 python3.9[179098]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439318.2498293-1801-199274596564861/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:20 compute-1 sudo[179248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqdprcibmkmkuphyvlgjvzohzdzivvmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439320.1135542-1967-46826324153135/AnsiballZ_file.py'
Jan 26 14:55:20 compute-1 sudo[179248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:20 compute-1 python3.9[179250]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:55:20 compute-1 sudo[179248]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:21 compute-1 sudo[179400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilvdjvpxjbzojtodpwjzlsgwwetmuowo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439320.8351398-1983-67078786338585/AnsiballZ_copy.py'
Jan 26 14:55:21 compute-1 sudo[179400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:21 compute-1 sshd-session[177714]: Disconnecting invalid user support 185.246.128.170 port 59632: Change of username or service not allowed: (support,ssh-connection) -> (prueba,ssh-connection) [preauth]
Jan 26 14:55:21 compute-1 python3.9[179402]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:55:21 compute-1 sudo[179400]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:21 compute-1 sudo[179552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdwvwjaqynbngdmatgsxgbbyifgnrwjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439321.5179374-1999-254705795325632/AnsiballZ_stat.py'
Jan 26 14:55:21 compute-1 sudo[179552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:22 compute-1 python3.9[179554]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:55:22 compute-1 sudo[179552]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:22 compute-1 sudo[179704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugsyquzcqtotcwofoplplauagvrorpbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439322.208768-2015-206183252590158/AnsiballZ_stat.py'
Jan 26 14:55:22 compute-1 sudo[179704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:22 compute-1 python3.9[179706]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:55:22 compute-1 sudo[179704]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:23 compute-1 sudo[179827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpziotehxjbajbyuonwypeuvnuximxgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439322.208768-2015-206183252590158/AnsiballZ_copy.py'
Jan 26 14:55:23 compute-1 sudo[179827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:23 compute-1 python3.9[179829]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769439322.208768-2015-206183252590158/.source _original_basename=.2duf8nlb follow=False checksum=a6c7e299a3e8421db7e950fddf4fb986236d3891 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 26 14:55:23 compute-1 sudo[179827]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:24 compute-1 python3.9[179981]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:55:24 compute-1 python3.9[180133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:55:25 compute-1 python3.9[180254]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439324.4056282-2067-59601074256428/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=d1772ffbd86569d7daf2bc5945417d43fa6aba40 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:26 compute-1 python3.9[180404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:55:26 compute-1 python3.9[180525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439325.5838606-2098-114588617638454/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=571a6af5e937ae0097dff34c48c4726d65ae4222 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:55:27 compute-1 sudo[180675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwoayejcimbacwyxgmkxyhkoynajoytl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439326.9789531-2131-34740733868927/AnsiballZ_container_config_data.py'
Jan 26 14:55:27 compute-1 sudo[180675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:27 compute-1 python3.9[180677]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 26 14:55:27 compute-1 sudo[180675]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:28 compute-1 sudo[180829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibbsevfursiypgfzqqysegnwwrimsywc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439328.0551727-2153-156480366707545/AnsiballZ_container_config_hash.py'
Jan 26 14:55:28 compute-1 sudo[180829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:28 compute-1 python3.9[180831]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 14:55:28 compute-1 sudo[180829]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:55:29.004 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:55:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:55:29.005 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:55:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:55:29.005 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:55:29 compute-1 sudo[180982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnpqwlmmxlbsfxtzibjoyrcpqskzqtkn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769439329.103881-2173-56961738836603/AnsiballZ_edpm_container_manage.py'
Jan 26 14:55:29 compute-1 sudo[180982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:29 compute-1 python3[180984]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 14:55:30 compute-1 podman[181019]: 2026-01-26 14:55:30.68151217 +0000 UTC m=+0.104766410 container create cde6bb7be400c0cb38b2d2adb37eb0bfa05712382eb5b4fd99bbf371133cb075 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, config_id=edpm, container_name=nova_compute_init, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 26 14:55:30 compute-1 podman[181019]: 2026-01-26 14:55:30.61563182 +0000 UTC m=+0.038886080 image pull dcf510f4656465f698906cac740f99e5970bfc138793d2c5abda6beb4ca068f0 38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Jan 26 14:55:30 compute-1 python3[180984]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 26 14:55:30 compute-1 sudo[180982]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:31 compute-1 sudo[181207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaxtggpdnlwkngfathoorxlyupevpnrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439330.9995582-2189-11214270412114/AnsiballZ_stat.py'
Jan 26 14:55:31 compute-1 sudo[181207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:31 compute-1 sshd-session[180702]: Invalid user prueba from 185.246.128.170 port 61906
Jan 26 14:55:31 compute-1 python3.9[181209]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:55:31 compute-1 sudo[181207]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:32 compute-1 sudo[181361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdnknbvhjcusnlbgtwyuuricxzapwwar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439331.9871397-2213-67337886906189/AnsiballZ_container_config_data.py'
Jan 26 14:55:32 compute-1 sudo[181361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:32 compute-1 python3.9[181363]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 26 14:55:32 compute-1 sudo[181361]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:33 compute-1 sudo[181513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zalnhvmjmbdvlbtadsegtepxyhkifnuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439332.8906665-2235-118543855932418/AnsiballZ_container_config_hash.py'
Jan 26 14:55:33 compute-1 sudo[181513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:33 compute-1 python3.9[181515]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 14:55:33 compute-1 sudo[181513]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:34 compute-1 podman[181616]: 2026-01-26 14:55:34.931661673 +0000 UTC m=+0.098626076 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 14:55:34 compute-1 podman[181615]: 2026-01-26 14:55:34.939258617 +0000 UTC m=+0.100604990 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 14:55:34 compute-1 sudo[181711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkxxhtkfrdqhvexrblutklxxtjrqguar ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769439334.6541438-2255-64808778361002/AnsiballZ_edpm_container_manage.py'
Jan 26 14:55:34 compute-1 sudo[181711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:35 compute-1 python3[181715]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 14:55:35 compute-1 podman[181751]: 2026-01-26 14:55:35.383784044 +0000 UTC m=+0.023888839 image pull dcf510f4656465f698906cac740f99e5970bfc138793d2c5abda6beb4ca068f0 38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Jan 26 14:55:35 compute-1 podman[181751]: 2026-01-26 14:55:35.646976087 +0000 UTC m=+0.287080872 container create 866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 14:55:35 compute-1 python3[181715]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Jan 26 14:55:35 compute-1 sshd-session[180702]: Disconnecting invalid user prueba 185.246.128.170 port 61906: Change of username or service not allowed: (prueba,ssh-connection) -> (charles,ssh-connection) [preauth]
Jan 26 14:55:35 compute-1 sudo[181711]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:36 compute-1 sudo[181940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsxjwcdvtckyyjnkalsjlsjqtpavndai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439336.0188298-2271-262970985465449/AnsiballZ_stat.py'
Jan 26 14:55:36 compute-1 sudo[181940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:36 compute-1 python3.9[181942]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:55:36 compute-1 sudo[181940]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:37 compute-1 sudo[182094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyyxlsmzezhztlqfygjgntwbmenuapts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439336.7935617-2289-120600831730253/AnsiballZ_file.py'
Jan 26 14:55:37 compute-1 sudo[182094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:37 compute-1 python3.9[182096]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:55:37 compute-1 sudo[182094]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:37 compute-1 sudo[182245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shfrnosxdssdhlkumugbcywdlknlfkie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439337.4078348-2289-119336462016202/AnsiballZ_copy.py'
Jan 26 14:55:37 compute-1 sudo[182245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:38 compute-1 python3.9[182247]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769439337.4078348-2289-119336462016202/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:55:38 compute-1 sudo[182245]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:38 compute-1 sudo[182321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlkjialsxbxyenrinwrpgrculgvffwqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439337.4078348-2289-119336462016202/AnsiballZ_systemd.py'
Jan 26 14:55:38 compute-1 sudo[182321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:38 compute-1 python3.9[182323]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:55:38 compute-1 systemd[1]: Reloading.
Jan 26 14:55:38 compute-1 systemd-rc-local-generator[182352]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:55:38 compute-1 systemd-sysv-generator[182356]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:55:38 compute-1 sudo[182321]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:39 compute-1 sudo[182433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbcnrdmakrajjjvznhkxbelawtbeyyqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439337.4078348-2289-119336462016202/AnsiballZ_systemd.py'
Jan 26 14:55:39 compute-1 sudo[182433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:39 compute-1 python3.9[182435]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:55:39 compute-1 systemd[1]: Reloading.
Jan 26 14:55:39 compute-1 systemd-rc-local-generator[182463]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:55:39 compute-1 systemd-sysv-generator[182467]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:55:40 compute-1 systemd[1]: Starting nova_compute container...
Jan 26 14:55:41 compute-1 systemd[1]: Started libcrun container.
Jan 26 14:55:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:41 compute-1 podman[182476]: 2026-01-26 14:55:41.908880305 +0000 UTC m=+1.176807876 container init 866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 26 14:55:41 compute-1 podman[182476]: 2026-01-26 14:55:41.914783012 +0000 UTC m=+1.182710563 container start 866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 14:55:41 compute-1 nova_compute[182490]: + sudo -E kolla_set_configs
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Validating config file
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Copying service configuration files
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Deleting /etc/ceph
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Creating directory /etc/ceph
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Writing out command to execute
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 14:55:42 compute-1 nova_compute[182490]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 14:55:42 compute-1 nova_compute[182490]: ++ cat /run_command
Jan 26 14:55:42 compute-1 nova_compute[182490]: + CMD=nova-compute
Jan 26 14:55:42 compute-1 nova_compute[182490]: + ARGS=
Jan 26 14:55:42 compute-1 nova_compute[182490]: + sudo kolla_copy_cacerts
Jan 26 14:55:42 compute-1 nova_compute[182490]: + [[ ! -n '' ]]
Jan 26 14:55:42 compute-1 nova_compute[182490]: + . kolla_extend_start
Jan 26 14:55:42 compute-1 nova_compute[182490]: Running command: 'nova-compute'
Jan 26 14:55:42 compute-1 nova_compute[182490]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 14:55:42 compute-1 nova_compute[182490]: + umask 0022
Jan 26 14:55:42 compute-1 nova_compute[182490]: + exec nova-compute
Jan 26 14:55:42 compute-1 podman[182476]: nova_compute
Jan 26 14:55:42 compute-1 systemd[1]: Started nova_compute container.
Jan 26 14:55:42 compute-1 sudo[182433]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:43 compute-1 python3.9[182652]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:55:44 compute-1 nova_compute[182490]: 2026-01-26 14:55:44.145 182495 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 14:55:44 compute-1 nova_compute[182490]: 2026-01-26 14:55:44.146 182495 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 14:55:44 compute-1 nova_compute[182490]: 2026-01-26 14:55:44.146 182495 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 14:55:44 compute-1 nova_compute[182490]: 2026-01-26 14:55:44.146 182495 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 26 14:55:44 compute-1 python3.9[182804]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:55:44 compute-1 nova_compute[182490]: 2026-01-26 14:55:44.267 182495 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 14:55:44 compute-1 nova_compute[182490]: 2026-01-26 14:55:44.294 182495 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 14:55:44 compute-1 nova_compute[182490]: 2026-01-26 14:55:44.295 182495 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Jan 26 14:55:44 compute-1 nova_compute[182490]: 2026-01-26 14:55:44.333 182495 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Jan 26 14:55:44 compute-1 nova_compute[182490]: 2026-01-26 14:55:44.335 182495 WARNING oslo_config.cfg [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Jan 26 14:55:45 compute-1 python3.9[182955]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:55:45 compute-1 nova_compute[182490]: 2026-01-26 14:55:45.358 182495 INFO nova.virt.driver [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 26 14:55:45 compute-1 nova_compute[182490]: 2026-01-26 14:55:45.471 182495 INFO nova.compute.provider_config [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 26 14:55:45 compute-1 sudo[183107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-domxhjgwafpxmxnlpohiciurzhsiwotm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439345.3644254-2409-49684519908038/AnsiballZ_podman_container.py'
Jan 26 14:55:45 compute-1 sudo[183107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:46 compute-1 python3.9[183109]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.188 182495 DEBUG oslo_concurrency.lockutils [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.189 182495 DEBUG oslo_concurrency.lockutils [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 14:55:46 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.189 182495 DEBUG oslo_concurrency.lockutils [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 14:55:46 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.190 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.191 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.191 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.191 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.192 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.192 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.193 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.193 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.193 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.194 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.194 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.194 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.195 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.195 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.195 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.195 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.196 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.196 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.196 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.197 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.197 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.197 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.198 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.198 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.198 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.199 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.199 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.199 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.200 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.200 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.201 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.201 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.201 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.202 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.202 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.202 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.203 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.203 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.203 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.204 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.204 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.204 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.205 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.205 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.205 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.206 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.206 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.206 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.207 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.207 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.207 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.208 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.208 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.208 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.209 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.209 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.209 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.210 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.210 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.210 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.210 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.211 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.211 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.211 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.211 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.212 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.212 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.212 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.213 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.213 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.213 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.214 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.214 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.214 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.215 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.215 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.215 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.215 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.216 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.216 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.216 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.216 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.217 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.217 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.217 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.218 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.218 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.218 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.218 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.219 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.219 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.219 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.219 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.220 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.220 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.220 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.220 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.221 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.221 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.221 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.221 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.222 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.222 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.222 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.222 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.223 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.223 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.223 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.224 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.224 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.224 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.224 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.225 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.225 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.225 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.225 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.226 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.226 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.226 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.226 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.227 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.227 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.227 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.227 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.228 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.228 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.228 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.228 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.229 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.229 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.229 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.229 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.230 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.230 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.230 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.230 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.231 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.231 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.231 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.231 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.232 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.232 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.232 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.232 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.233 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.233 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.233 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.233 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.234 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.234 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.234 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.235 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.235 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.235 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.236 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.236 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.236 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.236 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.237 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.237 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.237 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.238 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.238 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.238 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.239 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.239 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.239 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.240 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.240 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.240 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.240 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.240 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.241 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.241 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.241 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.241 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.241 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.242 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.242 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.242 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.242 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.243 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.243 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.243 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.243 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.243 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.244 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.244 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.244 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.244 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.244 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.245 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.245 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.245 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.245 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.245 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.246 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.246 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.246 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.246 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.246 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.247 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.247 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.247 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.247 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.248 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.248 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.248 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.248 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.248 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.248 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.249 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.249 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.249 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.249 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.249 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.250 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.250 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.250 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.250 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.250 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.250 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.251 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.251 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.251 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.251 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.251 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.252 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.252 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.252 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.252 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.252 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.252 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.253 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.253 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.253 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.253 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.253 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.254 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.254 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.254 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.254 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.254 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.255 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.255 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.255 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.255 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.255 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.256 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.256 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.256 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.256 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.257 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.257 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.257 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.257 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.257 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.257 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.258 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.258 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.258 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.258 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.258 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.259 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.259 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.259 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.259 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.259 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.259 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.260 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.260 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.260 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.260 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.260 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.261 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.261 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.261 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.261 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.261 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.262 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.262 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.262 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.262 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.263 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.263 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.263 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.263 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.263 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 sudo[183107]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.264 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.264 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.264 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.264 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.264 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.264 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.265 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.265 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.265 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.265 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.265 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.266 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.266 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.266 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.266 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.266 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.266 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.267 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.267 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.267 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.267 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.267 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.268 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.268 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.268 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.268 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.268 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.269 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.269 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.269 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.270 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.270 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.270 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.270 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.271 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.271 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.271 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.271 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.272 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.272 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.272 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.272 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.272 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.272 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.273 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.273 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.273 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.273 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.273 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.274 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.274 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.274 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.274 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.274 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.274 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.275 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.275 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.275 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.275 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.275 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.275 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.276 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.276 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.276 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.276 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.276 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.276 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.276 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.276 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.277 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.277 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.277 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.277 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.277 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.277 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.277 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.277 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.277 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.278 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.278 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.278 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.278 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.278 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.278 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.278 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.278 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.278 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.279 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.279 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.279 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.279 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.279 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.279 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.280 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.280 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.280 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.280 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.280 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.280 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.280 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.280 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.281 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.281 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.281 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.281 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.281 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.281 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.281 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.281 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.281 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.282 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.282 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.282 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.282 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.282 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.282 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.282 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.282 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.282 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.283 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.283 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.283 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.283 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.283 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.283 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.283 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.284 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.284 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.284 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.284 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.284 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.284 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.284 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.284 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.284 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.285 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.285 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.285 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.285 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.285 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.285 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.285 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.285 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.286 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.286 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.286 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.286 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.286 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.286 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.286 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.286 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.286 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.287 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.287 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.287 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.287 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.287 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.287 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.287 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.287 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.288 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.288 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.288 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.288 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.288 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.288 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.288 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.288 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.289 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.289 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.289 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.289 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.289 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.289 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.289 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.289 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.290 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.290 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.290 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.290 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.290 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.290 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.290 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.290 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.291 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.291 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.291 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.291 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.291 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.291 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.291 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.291 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.292 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.292 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.292 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.292 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.292 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.292 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.292 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.292 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.293 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.293 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.293 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.293 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.293 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.293 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.293 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.293 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.294 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.294 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.294 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.294 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.294 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.294 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.294 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.295 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.295 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.295 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.295 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.295 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.295 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.295 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.295 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.296 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.296 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.296 182495 WARNING oslo_config.cfg [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 14:55:46 compute-1 nova_compute[182490]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 14:55:46 compute-1 nova_compute[182490]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 14:55:46 compute-1 nova_compute[182490]: and ``live_migration_inbound_addr`` respectively.
Jan 26 14:55:46 compute-1 nova_compute[182490]: ).  Its value may be silently ignored in the future.
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.296 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.296 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.296 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.296 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.297 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.297 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.297 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.297 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.297 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.297 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.297 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.297 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.298 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.298 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.298 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.298 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.298 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.298 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.298 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.298 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.298 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.299 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.299 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.299 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.299 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.299 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.299 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.299 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.299 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.300 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.300 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.300 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.300 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.300 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.300 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.300 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.301 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.301 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.301 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.301 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.301 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.301 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.301 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.302 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.302 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.302 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.302 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.302 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.302 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.302 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.302 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.302 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.303 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.303 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.303 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.303 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.303 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.303 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.303 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.303 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.304 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.304 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.304 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.304 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.304 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.304 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.304 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.304 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.305 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.305 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.305 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.305 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.305 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.305 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.305 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.305 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.305 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.306 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.306 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.306 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.306 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.306 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.306 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.306 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.307 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.307 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.307 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.307 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.307 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.307 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.308 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.308 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.308 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.308 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.308 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.308 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.308 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.308 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.309 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.309 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.309 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.309 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.309 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.309 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.309 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.309 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.309 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.310 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.310 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.310 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.310 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.310 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.310 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.310 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.310 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.311 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.311 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.311 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.311 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.311 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.311 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.311 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.312 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.312 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.312 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.312 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.312 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.312 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.312 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.313 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.313 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.313 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.313 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.313 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.313 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.313 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.313 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.314 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.314 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.314 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.314 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.314 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.314 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.314 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.314 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.314 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.315 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.315 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.315 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.315 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.315 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.315 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.315 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.315 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.316 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.316 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.316 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.316 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.316 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.316 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.316 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.317 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.317 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.317 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.317 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.317 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.317 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.317 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.318 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.318 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.318 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.318 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.318 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.318 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.318 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.318 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.319 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.319 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.319 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.319 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.319 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.319 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.319 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.319 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.319 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.320 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.320 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.320 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.320 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.320 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.320 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.320 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.321 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.321 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.321 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.321 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.321 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.321 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.321 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.321 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.322 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.322 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.322 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.322 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.322 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.322 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.323 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.323 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.323 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.323 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.323 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.323 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.324 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.324 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.324 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.324 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.324 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.324 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.324 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.325 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.325 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.325 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.325 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.325 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.325 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.325 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.325 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.325 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.326 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.326 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.326 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.326 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.326 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.326 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.326 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.326 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.327 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.327 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.327 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.327 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.327 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.327 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.327 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.327 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.327 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.328 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.328 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.328 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.328 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.328 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.328 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.328 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.328 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.329 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.329 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.329 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.329 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.329 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.329 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.329 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.330 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.330 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.330 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.330 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.330 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.331 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.331 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.331 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.331 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.331 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.331 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.331 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.331 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.332 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.332 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.332 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.332 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.332 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.332 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.332 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.333 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.333 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.333 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.333 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.333 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.333 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.333 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.333 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.334 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.334 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.334 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.334 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.334 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.334 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.334 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.335 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.335 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.335 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.335 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.335 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.335 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.335 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.335 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.336 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.336 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.336 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.336 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.336 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.336 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.336 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.336 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.336 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.337 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.337 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.337 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.337 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.337 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.337 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.337 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.337 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.338 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.338 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.338 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.338 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.338 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.338 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.338 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.338 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.339 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.339 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.339 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.339 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.339 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.339 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.339 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.339 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.340 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.340 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.340 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.340 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.340 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.340 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.340 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.340 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.341 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.341 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.341 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.341 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.341 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.341 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.341 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.341 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.341 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.342 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.342 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.342 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.342 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.342 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.342 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.342 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.342 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.343 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.343 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.343 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.343 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.343 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.343 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.343 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.343 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.344 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.344 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.344 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.344 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.344 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.344 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.344 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.344 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.344 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.345 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.345 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.345 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.345 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.345 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.345 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.345 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.345 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.346 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.346 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.346 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.346 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.346 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.346 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.346 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.346 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.346 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.347 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.347 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.347 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.347 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.347 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.347 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.347 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.347 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.348 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.348 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.348 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.348 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.348 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.348 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.348 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.348 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.349 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.349 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.349 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.349 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.349 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.349 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.349 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.349 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.350 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.350 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.350 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.350 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.350 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.350 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.350 182495 DEBUG oslo_service.backend._eventlet.service [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.351 182495 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.890 182495 DEBUG nova.virt.libvirt.host [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Jan 26 14:55:46 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 14:55:46 compute-1 sudo[183301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubjvlggghnxvpddsarverunyclrzvugk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439346.6043873-2425-154106102774817/AnsiballZ_systemd.py'
Jan 26 14:55:46 compute-1 sudo[183301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:46 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.962 182495 DEBUG nova.virt.libvirt.host [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fa5ca8c5070> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Jan 26 14:55:46 compute-1 nova_compute[182490]: libvirt:  error : internal error: could not initialize domain event timer
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.964 182495 WARNING nova.virt.libvirt.host [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.964 182495 DEBUG nova.virt.libvirt.host [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fa5ca8c5070> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.966 182495 DEBUG nova.virt.libvirt.host [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.966 182495 DEBUG nova.virt.libvirt.host [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.967 182495 INFO nova.utils [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] The default thread pool MainProcess.default is initialized
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.967 182495 DEBUG nova.virt.libvirt.host [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Jan 26 14:55:46 compute-1 nova_compute[182490]: 2026-01-26 14:55:46.967 182495 INFO nova.virt.libvirt.driver [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] Connection event '1' reason 'None'
Jan 26 14:55:47 compute-1 python3.9[183313]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 14:55:47 compute-1 systemd[1]: Stopping nova_compute container...
Jan 26 14:55:47 compute-1 sshd-session[182436]: Invalid user charles from 185.246.128.170 port 32343
Jan 26 14:55:47 compute-1 nova_compute[182490]: 2026-01-26 14:55:47.474 182495 WARNING nova.virt.libvirt.driver [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 26 14:55:47 compute-1 nova_compute[182490]: 2026-01-26 14:55:47.475 182495 DEBUG nova.virt.libvirt.volume.mount [None req-1680237e-882a-4ef5-8abf-2c31ae50d2e6 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 26 14:55:47 compute-1 sshd-session[182436]: Disconnecting invalid user charles 185.246.128.170 port 32343: Change of username or service not allowed: (charles,ssh-connection) -> (hduser,ssh-connection) [preauth]
Jan 26 14:55:47 compute-1 nova_compute[182490]: 2026-01-26 14:55:47.525 182495 DEBUG oslo_concurrency.lockutils [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 14:55:47 compute-1 nova_compute[182490]: 2026-01-26 14:55:47.525 182495 DEBUG oslo_concurrency.lockutils [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 14:55:47 compute-1 nova_compute[182490]: 2026-01-26 14:55:47.525 182495 DEBUG oslo_concurrency.lockutils [None req-f6e7b945-08ca-4fc4-8d32-18ad04fa78a6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 14:55:48 compute-1 virtqemud[183290]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 26 14:55:48 compute-1 virtqemud[183290]: hostname: compute-1
Jan 26 14:55:48 compute-1 virtqemud[183290]: End of file while reading data: Input/output error
Jan 26 14:55:48 compute-1 systemd[1]: libpod-866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08.scope: Deactivated successfully.
Jan 26 14:55:48 compute-1 systemd[1]: libpod-866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08.scope: Consumed 3.257s CPU time.
Jan 26 14:55:48 compute-1 podman[183338]: 2026-01-26 14:55:48.396534741 +0000 UTC m=+1.114696336 container died 866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 14:55:48 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08-userdata-shm.mount: Deactivated successfully.
Jan 26 14:55:48 compute-1 systemd[1]: var-lib-containers-storage-overlay-be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5-merged.mount: Deactivated successfully.
Jan 26 14:55:48 compute-1 podman[183338]: 2026-01-26 14:55:48.846293865 +0000 UTC m=+1.564455470 container cleanup 866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4)
Jan 26 14:55:48 compute-1 podman[183338]: nova_compute
Jan 26 14:55:48 compute-1 podman[183376]: nova_compute
Jan 26 14:55:48 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 26 14:55:48 compute-1 systemd[1]: Stopped nova_compute container.
Jan 26 14:55:48 compute-1 systemd[1]: Starting nova_compute container...
Jan 26 14:55:49 compute-1 systemd[1]: Started libcrun container.
Jan 26 14:55:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be34c0e6124fb376aadd33880c07eec3a86e988969b52391b1b9b442d382aeb5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:49 compute-1 podman[183388]: 2026-01-26 14:55:49.262491358 +0000 UTC m=+0.317360995 container init 866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 14:55:49 compute-1 podman[183388]: 2026-01-26 14:55:49.272527958 +0000 UTC m=+0.327397575 container start 866567b0a71fb5531b526bf9199580f7634f76f1430c78638008f95666284c08 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.build-date=20260120, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 14:55:49 compute-1 nova_compute[183403]: + sudo -E kolla_set_configs
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Validating config file
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Copying service configuration files
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Deleting /etc/ceph
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Creating directory /etc/ceph
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Writing out command to execute
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 14:55:49 compute-1 nova_compute[183403]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 14:55:49 compute-1 nova_compute[183403]: ++ cat /run_command
Jan 26 14:55:49 compute-1 nova_compute[183403]: + CMD=nova-compute
Jan 26 14:55:49 compute-1 nova_compute[183403]: + ARGS=
Jan 26 14:55:49 compute-1 nova_compute[183403]: + sudo kolla_copy_cacerts
Jan 26 14:55:49 compute-1 nova_compute[183403]: + [[ ! -n '' ]]
Jan 26 14:55:49 compute-1 nova_compute[183403]: + . kolla_extend_start
Jan 26 14:55:49 compute-1 nova_compute[183403]: Running command: 'nova-compute'
Jan 26 14:55:49 compute-1 nova_compute[183403]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 14:55:49 compute-1 nova_compute[183403]: + umask 0022
Jan 26 14:55:49 compute-1 nova_compute[183403]: + exec nova-compute
Jan 26 14:55:49 compute-1 podman[183388]: nova_compute
Jan 26 14:55:49 compute-1 systemd[1]: Started nova_compute container.
Jan 26 14:55:49 compute-1 sudo[183301]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:50 compute-1 sudo[183564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjfekaywojcvwsouunufjstfdcroslis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439349.784231-2443-97161909076568/AnsiballZ_podman_container.py'
Jan 26 14:55:50 compute-1 sudo[183564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:50 compute-1 python3.9[183566]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 14:55:50 compute-1 systemd[1]: Started libpod-conmon-cde6bb7be400c0cb38b2d2adb37eb0bfa05712382eb5b4fd99bbf371133cb075.scope.
Jan 26 14:55:50 compute-1 systemd[1]: Started libcrun container.
Jan 26 14:55:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7c6599e75daa877062e217b75b06e39b7b6d4fd4f1e3acb0e64133061b1fe18/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7c6599e75daa877062e217b75b06e39b7b6d4fd4f1e3acb0e64133061b1fe18/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7c6599e75daa877062e217b75b06e39b7b6d4fd4f1e3acb0e64133061b1fe18/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 26 14:55:50 compute-1 podman[183592]: 2026-01-26 14:55:50.779955174 +0000 UTC m=+0.386710391 container init cde6bb7be400c0cb38b2d2adb37eb0bfa05712382eb5b4fd99bbf371133cb075 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_id=edpm, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute_init, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 26 14:55:50 compute-1 podman[183592]: 2026-01-26 14:55:50.792890601 +0000 UTC m=+0.399645818 container start cde6bb7be400c0cb38b2d2adb37eb0bfa05712382eb5b4fd99bbf371133cb075 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_build_tag=watcher_latest, container_name=nova_compute_init, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 14:55:50 compute-1 python3.9[183566]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Applying nova statedir ownership
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 26 14:55:50 compute-1 nova_compute_init[183613]: INFO:nova_statedir:Nova statedir ownership complete
Jan 26 14:55:50 compute-1 systemd[1]: libpod-cde6bb7be400c0cb38b2d2adb37eb0bfa05712382eb5b4fd99bbf371133cb075.scope: Deactivated successfully.
Jan 26 14:55:50 compute-1 podman[183614]: 2026-01-26 14:55:50.872620275 +0000 UTC m=+0.040785807 container died cde6bb7be400c0cb38b2d2adb37eb0bfa05712382eb5b4fd99bbf371133cb075 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 14:55:51 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cde6bb7be400c0cb38b2d2adb37eb0bfa05712382eb5b4fd99bbf371133cb075-userdata-shm.mount: Deactivated successfully.
Jan 26 14:55:51 compute-1 systemd[1]: var-lib-containers-storage-overlay-a7c6599e75daa877062e217b75b06e39b7b6d4fd4f1e3acb0e64133061b1fe18-merged.mount: Deactivated successfully.
Jan 26 14:55:51 compute-1 podman[183627]: 2026-01-26 14:55:51.070149937 +0000 UTC m=+0.179101017 container cleanup cde6bb7be400c0cb38b2d2adb37eb0bfa05712382eb5b4fd99bbf371133cb075 (image=38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.4, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.230:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 14:55:51 compute-1 systemd[1]: libpod-conmon-cde6bb7be400c0cb38b2d2adb37eb0bfa05712382eb5b4fd99bbf371133cb075.scope: Deactivated successfully.
Jan 26 14:55:51 compute-1 sudo[183564]: pam_unix(sudo:session): session closed for user root
Jan 26 14:55:51 compute-1 nova_compute[183403]: 2026-01-26 14:55:51.408 183407 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 14:55:51 compute-1 nova_compute[183403]: 2026-01-26 14:55:51.408 183407 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 14:55:51 compute-1 nova_compute[183403]: 2026-01-26 14:55:51.408 183407 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 14:55:51 compute-1 nova_compute[183403]: 2026-01-26 14:55:51.408 183407 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 26 14:55:51 compute-1 nova_compute[183403]: 2026-01-26 14:55:51.531 183407 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 14:55:51 compute-1 nova_compute[183403]: 2026-01-26 14:55:51.543 183407 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 14:55:51 compute-1 nova_compute[183403]: 2026-01-26 14:55:51.544 183407 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Jan 26 14:55:51 compute-1 nova_compute[183403]: 2026-01-26 14:55:51.575 183407 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Jan 26 14:55:51 compute-1 nova_compute[183403]: 2026-01-26 14:55:51.576 183407 WARNING oslo_config.cfg [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Jan 26 14:55:51 compute-1 sshd-session[160295]: Connection closed by 192.168.122.30 port 39638
Jan 26 14:55:51 compute-1 sshd-session[160292]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:55:51 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Jan 26 14:55:51 compute-1 systemd[1]: session-25.scope: Consumed 1min 40.136s CPU time.
Jan 26 14:55:51 compute-1 systemd-logind[795]: Session 25 logged out. Waiting for processes to exit.
Jan 26 14:55:51 compute-1 systemd-logind[795]: Removed session 25.
Jan 26 14:55:52 compute-1 nova_compute[183403]: 2026-01-26 14:55:52.545 183407 INFO nova.virt.driver [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 26 14:55:52 compute-1 nova_compute[183403]: 2026-01-26 14:55:52.636 183407 INFO nova.compute.provider_config [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.192 183407 DEBUG oslo_concurrency.lockutils [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.192 183407 DEBUG oslo_concurrency.lockutils [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.193 183407 DEBUG oslo_concurrency.lockutils [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.193 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.194 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.194 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.194 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.195 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.195 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.196 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.196 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.196 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.196 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.197 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.197 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.197 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.198 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.198 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.198 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.199 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.199 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.199 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.199 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.200 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.200 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.200 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.201 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.201 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.201 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.201 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.202 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.202 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.203 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.203 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.203 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.203 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.204 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.204 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.204 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.205 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.205 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.205 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.206 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.206 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.206 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.207 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.207 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.207 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.208 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.208 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.208 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.209 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.209 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.209 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.209 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.210 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.210 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.210 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.211 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.211 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.211 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.211 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.212 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.212 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.212 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.212 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.213 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.213 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.213 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.214 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.214 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.214 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.215 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.215 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.215 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.215 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.216 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.216 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.216 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.217 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.217 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.217 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.217 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.218 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.218 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.218 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] my_shared_fs_storage_ip        = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.219 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.219 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.219 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.220 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.220 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.220 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.221 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.221 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.221 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.221 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.222 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.222 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.222 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.223 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.223 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.223 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.223 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.224 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.224 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.224 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.224 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.225 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.225 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.225 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.226 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.226 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.226 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.227 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.227 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.227 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.228 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.228 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.228 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.229 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.229 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.229 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.229 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.230 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.230 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.230 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.231 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.231 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.231 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.232 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.232 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.232 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.232 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.233 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.233 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.233 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.234 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.234 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.234 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.234 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.235 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.235 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.235 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.235 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.236 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.236 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.236 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.237 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.237 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.237 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.238 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.238 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.238 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.238 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.239 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.239 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.239 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.240 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.240 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.240 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.241 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.241 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.241 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.242 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.242 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.242 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.242 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.243 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.243 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.243 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.243 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.243 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.243 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.244 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.244 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.244 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.244 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.244 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.245 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.245 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.245 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.245 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.245 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.245 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.246 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.246 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.246 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.246 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.246 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.247 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.247 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.247 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.247 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.247 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.248 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.248 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.248 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.248 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.248 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.248 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.249 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.249 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.249 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.249 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.249 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.249 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.250 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.250 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.250 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.250 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.250 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.250 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.251 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.251 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.251 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.251 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.251 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.251 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.252 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.252 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.252 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.252 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.252 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.252 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.253 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.253 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.253 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.253 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.253 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.253 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.254 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.254 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.254 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.254 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.254 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.254 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.254 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.255 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.255 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.255 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.255 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.255 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.255 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.256 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.256 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.256 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.256 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.256 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.256 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.257 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.257 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.257 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.257 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.257 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.258 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.258 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.258 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.258 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.258 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.258 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.259 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.259 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.259 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.259 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.259 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.260 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.260 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.260 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.260 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.260 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.260 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.260 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.261 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.261 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.261 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.261 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.261 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.262 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.262 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.262 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.262 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.262 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.263 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.263 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.263 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.263 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.264 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.264 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.264 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.264 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.265 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.265 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.265 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.265 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.266 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.266 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.266 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.266 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.267 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.267 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.267 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.268 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.268 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.268 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.268 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.269 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.269 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.269 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.269 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.270 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.270 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.270 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.270 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.271 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.271 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.271 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.272 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.272 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.272 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.273 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.273 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.273 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.273 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.274 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.274 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.274 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.274 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.275 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.275 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.275 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.275 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.276 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.276 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.276 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.277 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.277 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.277 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.278 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.278 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.278 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.279 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.279 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.279 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.279 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.279 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.279 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.280 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.280 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.280 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.280 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.280 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.281 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.281 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.281 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.281 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.281 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.282 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.282 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.282 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.282 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.282 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.283 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.283 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.283 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.283 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.283 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.284 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.284 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.284 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.284 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.284 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.285 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.285 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.285 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.285 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.285 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.286 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.286 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.286 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.286 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.287 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.287 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.287 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.287 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.287 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.288 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.288 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.288 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.288 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.288 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.289 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.289 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.289 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.289 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.289 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.290 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.290 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.290 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.290 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.290 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.291 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.291 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.291 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.291 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.291 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.291 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.292 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.292 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.292 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.292 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.292 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.293 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.293 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.293 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.293 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.293 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.294 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.294 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.294 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.294 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.294 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.295 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.295 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.295 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.295 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.295 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.296 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.296 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.296 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.296 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.296 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.297 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.297 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.297 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.297 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.297 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.298 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.298 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.298 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.298 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.298 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.298 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.299 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.299 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.299 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.299 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.299 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.300 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.300 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.300 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.300 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.300 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.301 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.301 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.301 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.301 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.301 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.302 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.302 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.302 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.302 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.302 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.303 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.303 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.303 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.303 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.303 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.304 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.304 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.304 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.304 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.304 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.305 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.305 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.305 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.305 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.305 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.306 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.306 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.306 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.306 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.307 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.307 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.307 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.307 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.307 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.308 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.308 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.308 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.308 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.308 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.308 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.309 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.309 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.309 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.309 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.310 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.310 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.310 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.310 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.310 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.310 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.311 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.311 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.311 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.311 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.311 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.312 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.312 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.312 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.312 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.312 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.313 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.313 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.313 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.313 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.314 183407 WARNING oslo_config.cfg [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 14:55:53 compute-1 nova_compute[183403]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 14:55:53 compute-1 nova_compute[183403]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 14:55:53 compute-1 nova_compute[183403]: and ``live_migration_inbound_addr`` respectively.
Jan 26 14:55:53 compute-1 nova_compute[183403]: ).  Its value may be silently ignored in the future.
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.314 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
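[editor's note: the WARNING above flags `live_migration_uri` as deprecated. Based on the replacement options the warning itself names, and on the values this host already logs (`live_migration_scheme`, `migration_inbound_addr = 192.168.122.101`), an equivalent non-deprecated nova.conf fragment would look roughly like the sketch below. The address is this host's logged value and is illustrative; adjust per deployment.]

```ini
# Hypothetical nova.conf migration away from the deprecated option.
# Before (deprecated, as currently logged by this service):
#   [libvirt]
#   live_migration_uri = qemu+tls://%s/system

[libvirt]
# Scheme portion of the old URI ("qemu+tls://.../system" -> tls):
live_migration_scheme = tls
# Target address portion; 192.168.122.101 is what this host logs as
# migration_inbound_addr and stands in for the old URI's %s placeholder:
live_migration_inbound_addr = 192.168.122.101
```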
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.314 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.314 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.314 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.315 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.migration_inbound_addr = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.315 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.315 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.315 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.315 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.316 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.316 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.316 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.316 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.316 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.317 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.317 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.317 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.317 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.317 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.318 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.318 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.318 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.318 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.318 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.319 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.319 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.319 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.319 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.319 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.320 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.320 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.320 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.320 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.320 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.321 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.321 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.321 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.321 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.322 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.322 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.322 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.322 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.322 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.323 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.323 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.323 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.323 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.324 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.324 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.324 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.324 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.324 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.325 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.325 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.325 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.325 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.326 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.326 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.326 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.326 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.326 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.326 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.327 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.327 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.327 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.327 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.327 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.327 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.327 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.328 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.328 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.328 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.328 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.328 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.328 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.328 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.328 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.329 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.329 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.329 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.329 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.329 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.329 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.329 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.329 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.330 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.330 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.330 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.330 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.330 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.330 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.330 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.330 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.331 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.331 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.331 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.331 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.331 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.331 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.331 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.331 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.332 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.332 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.332 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.332 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.332 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.332 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.332 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.332 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.332 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.333 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.333 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.333 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.333 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.333 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.333 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.333 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.334 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.334 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.334 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.334 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.334 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.334 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.335 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.335 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.335 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.335 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.335 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.335 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.335 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.336 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.336 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.336 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.336 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.336 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.336 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.337 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.337 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.337 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.337 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.337 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.337 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.338 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.338 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.338 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.338 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.338 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.338 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.339 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.339 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.339 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.339 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.339 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.339 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.340 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.340 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.340 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.340 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.340 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.340 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.340 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.341 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.341 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.341 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.341 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.341 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.341 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.342 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.342 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.342 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.342 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.342 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.342 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.342 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.342 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.343 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.343 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.343 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.343 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.343 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.343 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.343 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.343 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.343 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.344 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.344 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.344 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.344 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.344 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.344 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.344 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.345 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.345 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.345 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.345 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.345 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.345 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.345 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.345 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.346 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.346 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.346 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.346 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.346 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.346 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.346 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.346 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.347 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.347 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.347 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.347 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.347 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.347 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.347 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.348 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.348 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.348 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.348 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.348 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.348 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.348 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.348 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.349 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.349 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.349 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.349 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.349 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.349 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.349 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.349 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.350 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.350 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.350 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.350 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.350 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.350 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.350 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.350 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.351 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.351 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.351 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.351 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.351 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.351 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.351 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.351 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.351 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.352 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.352 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.352 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.352 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.352 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.352 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.352 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.352 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.353 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.353 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.353 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.353 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.353 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.353 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.353 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.354 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.354 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.354 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.354 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.354 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.354 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.354 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.354 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.355 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.355 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.355 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.355 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.355 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.355 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.355 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.355 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.356 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.356 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.356 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.356 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.356 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.356 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.356 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.357 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.357 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.357 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.357 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.357 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.357 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.357 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.357 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.358 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.358 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.358 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.358 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.358 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.358 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.358 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.358 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.359 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.359 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.359 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.359 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.359 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.359 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.359 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.359 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.359 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.360 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.360 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.360 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.360 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.360 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.360 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.360 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.361 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.361 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.361 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.361 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.361 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.361 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.361 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.361 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.362 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.362 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.362 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.362 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.362 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.362 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.362 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.362 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.363 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.363 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.363 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.363 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.363 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.363 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.363 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.363 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.364 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.364 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.364 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.364 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.364 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.364 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.364 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.364 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.365 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.365 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.365 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.365 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.365 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.365 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.365 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.366 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.366 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.366 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.366 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.366 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.366 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.366 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.366 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.366 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.367 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.367 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.367 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.367 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.367 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.367 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.367 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.367 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.368 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.368 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.368 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.368 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.368 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.368 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.368 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.369 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.369 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.369 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.369 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.369 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.369 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.370 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.370 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.370 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.370 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.370 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.370 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.370 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.371 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.371 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.371 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.371 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.371 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.371 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.371 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.371 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.372 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.372 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.372 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.372 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.372 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.372 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.373 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.373 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.373 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.373 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.373 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.373 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.373 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.373 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.374 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.374 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.374 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.374 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.374 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.374 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.374 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.375 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.375 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.375 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.375 183407 DEBUG oslo_service.backend._eventlet.service [None req-a8f57a12-a9c2-4cb6-a53d-c45b7060df4b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Jan 26 14:55:53 compute-1 nova_compute[183403]: 2026-01-26 14:55:53.376 183407 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.007 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.023 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f3988021670> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Jan 26 14:55:54 compute-1 nova_compute[183403]: libvirt:  error : internal error: could not initialize domain event timer
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.024 183407 WARNING nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.025 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f3988021670> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.026 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.027 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.027 183407 INFO nova.utils [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] The default thread pool MainProcess.default is initialized
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.027 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.028 183407 INFO nova.virt.libvirt.driver [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Connection event '1' reason 'None'
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.035 183407 INFO nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]: 
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <host>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <uuid>79d25091-300b-4c01-ad96-85507a639987</uuid>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <arch>x86_64</arch>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model>EPYC-Rome-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <vendor>AMD</vendor>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <microcode version='16777317'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <signature family='23' model='49' stepping='0'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='x2apic'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='tsc-deadline'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='osxsave'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='hypervisor'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='tsc_adjust'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='spec-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='stibp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='arch-capabilities'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='cmp_legacy'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='topoext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='virt-ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='lbrv'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='tsc-scale'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='vmcb-clean'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='pause-filter'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='pfthreshold'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='svme-addr-chk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='rdctl-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='skip-l1dfl-vmentry'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='mds-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature name='pschange-mc-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <pages unit='KiB' size='4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <pages unit='KiB' size='2048'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <pages unit='KiB' size='1048576'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <power_management>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <suspend_mem/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <suspend_disk/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <suspend_hybrid/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </power_management>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <iommu support='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <migration_features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <live/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <uri_transports>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <uri_transport>tcp</uri_transport>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <uri_transport>rdma</uri_transport>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </uri_transports>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </migration_features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <topology>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <cells num='1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <cell id='0'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:           <memory unit='KiB'>7864316</memory>
Jan 26 14:55:54 compute-1 nova_compute[183403]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 26 14:55:54 compute-1 nova_compute[183403]:           <pages unit='KiB' size='2048'>0</pages>
Jan 26 14:55:54 compute-1 nova_compute[183403]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 26 14:55:54 compute-1 nova_compute[183403]:           <distances>
Jan 26 14:55:54 compute-1 nova_compute[183403]:             <sibling id='0' value='10'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:           </distances>
Jan 26 14:55:54 compute-1 nova_compute[183403]:           <cpus num='8'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:           </cpus>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         </cell>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </cells>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </topology>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <cache>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </cache>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <secmodel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model>selinux</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <doi>0</doi>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </secmodel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <secmodel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model>dac</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <doi>0</doi>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </secmodel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </host>
Jan 26 14:55:54 compute-1 nova_compute[183403]: 
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <guest>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <os_type>hvm</os_type>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <arch name='i686'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <wordsize>32</wordsize>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <domain type='qemu'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <domain type='kvm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </arch>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <pae/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <nonpae/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <acpi default='on' toggle='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <apic default='on' toggle='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <cpuselection/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <deviceboot/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <disksnapshot default='on' toggle='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <externalSnapshot/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </guest>
Jan 26 14:55:54 compute-1 nova_compute[183403]: 
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <guest>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <os_type>hvm</os_type>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <arch name='x86_64'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <wordsize>64</wordsize>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <domain type='qemu'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <domain type='kvm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </arch>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <acpi default='on' toggle='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <apic default='on' toggle='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <cpuselection/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <deviceboot/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <disksnapshot default='on' toggle='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <externalSnapshot/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </guest>
Jan 26 14:55:54 compute-1 nova_compute[183403]: 
Jan 26 14:55:54 compute-1 nova_compute[183403]: </capabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]: 
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.041 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.062 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 14:55:54 compute-1 nova_compute[183403]: <domainCapabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <domain>kvm</domain>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <arch>i686</arch>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <vcpu max='240'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <iothreads supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <os supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <enum name='firmware'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <loader supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>rom</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pflash</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='readonly'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>yes</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>no</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='secure'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>no</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </loader>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </os>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='host-passthrough' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='hostPassthroughMigratable'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>on</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>off</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='maximum' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='maximumMigratable'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>on</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>off</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='host-model' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <vendor>AMD</vendor>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='x2apic'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='hypervisor'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='stibp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='overflow-recov'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='succor'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='lbrv'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc-scale'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='flushbyasid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='pause-filter'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='pfthreshold'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='disable' name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='custom' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='ClearwaterForest'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ddpd-u'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sha512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='ClearwaterForest-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ddpd-u'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sha512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Dhyana-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Turin'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vp2intersect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibpb-brtype'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbpb'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='srso-user-kernel-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Turin-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vp2intersect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibpb-brtype'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbpb'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='srso-user-kernel-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-128'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-256'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-128'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-256'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v6'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v7'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='KnightsMill'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4fmaps'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4vnniw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512er'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512pf'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='KnightsMill-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4fmaps'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4vnniw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512er'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512pf'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G4-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tbm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G5-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tbm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='athlon'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='athlon-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='core2duo'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='core2duo-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='coreduo'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='coreduo-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='n270'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='n270-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='phenom'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='phenom-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <memoryBacking supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <enum name='sourceType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>file</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>anonymous</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>memfd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </memoryBacking>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <devices>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <disk supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='diskDevice'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>disk</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>cdrom</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>floppy</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>lun</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='bus'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ide</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>fdc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>scsi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>sata</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-non-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </disk>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <graphics supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vnc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>egl-headless</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dbus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </graphics>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <video supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='modelType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vga</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>cirrus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>none</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>bochs</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ramfb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </video>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <hostdev supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='mode'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>subsystem</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='startupPolicy'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>default</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>mandatory</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>requisite</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>optional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='subsysType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pci</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>scsi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='capsType'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='pciBackend'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </hostdev>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <rng supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-non-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>random</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>egd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>builtin</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </rng>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <filesystem supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='driverType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>path</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>handle</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtiofs</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </filesystem>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <tpm supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tpm-tis</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tpm-crb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>emulator</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>external</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendVersion'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>2.0</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </tpm>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <redirdev supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='bus'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </redirdev>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <channel supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pty</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>unix</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </channel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <crypto supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>qemu</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>builtin</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </crypto>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <interface supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>default</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>passt</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </interface>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <panic supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>isa</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>hyperv</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </panic>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <console supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>null</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pty</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dev</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>file</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pipe</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>stdio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>udp</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tcp</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>unix</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>qemu-vdagent</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dbus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </console>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </devices>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <gic supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <vmcoreinfo supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <genid supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <backingStoreInput supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <backup supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <async-teardown supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <s390-pv supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <ps2 supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <tdx supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <sev supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <sgx supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <hyperv supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='features'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>relaxed</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vapic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>spinlocks</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vpindex</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>runtime</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>synic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>stimer</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>reset</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vendor_id</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>frequencies</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>reenlightenment</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tlbflush</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ipi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>avic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>emsr_bitmap</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>xmm_input</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <defaults>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <spinlocks>4095</spinlocks>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <stimer_direct>on</stimer_direct>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </defaults>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </hyperv>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <launchSecurity supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </features>
Jan 26 14:55:54 compute-1 nova_compute[183403]: </domainCapabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.070 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 14:55:54 compute-1 nova_compute[183403]: <domainCapabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <domain>kvm</domain>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <arch>i686</arch>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <vcpu max='4096'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <iothreads supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <os supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <enum name='firmware'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <loader supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>rom</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pflash</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='readonly'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>yes</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>no</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='secure'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>no</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </loader>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </os>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='host-passthrough' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='hostPassthroughMigratable'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>on</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>off</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='maximum' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='maximumMigratable'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>on</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>off</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='host-model' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <vendor>AMD</vendor>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='x2apic'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='hypervisor'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='stibp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='overflow-recov'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='succor'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='lbrv'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc-scale'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='flushbyasid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='pause-filter'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='pfthreshold'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='disable' name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='custom' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='ClearwaterForest'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ddpd-u'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sha512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='ClearwaterForest-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ddpd-u'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sha512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Dhyana-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Turin'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vp2intersect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibpb-brtype'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbpb'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='srso-user-kernel-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Turin-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vp2intersect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibpb-brtype'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbpb'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='srso-user-kernel-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-128'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-256'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-128'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-256'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v6'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v7'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='KnightsMill'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4fmaps'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4vnniw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512er'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512pf'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='KnightsMill-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4fmaps'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4vnniw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512er'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512pf'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G4-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tbm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G5-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tbm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='athlon'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='athlon-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='core2duo'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='core2duo-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='coreduo'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='coreduo-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='n270'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='n270-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='phenom'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='phenom-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <memoryBacking supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <enum name='sourceType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>file</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>anonymous</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>memfd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </memoryBacking>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <devices>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <disk supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='diskDevice'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>disk</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>cdrom</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>floppy</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>lun</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='bus'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>fdc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>scsi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>sata</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-non-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </disk>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <graphics supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vnc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>egl-headless</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dbus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </graphics>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <video supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='modelType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vga</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>cirrus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>none</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>bochs</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ramfb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </video>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <hostdev supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='mode'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>subsystem</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='startupPolicy'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>default</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>mandatory</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>requisite</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>optional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='subsysType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pci</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>scsi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='capsType'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='pciBackend'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </hostdev>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <rng supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-non-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>random</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>egd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>builtin</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </rng>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <filesystem supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='driverType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>path</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>handle</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtiofs</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </filesystem>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <tpm supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tpm-tis</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tpm-crb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>emulator</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>external</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendVersion'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>2.0</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </tpm>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <redirdev supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='bus'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </redirdev>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <channel supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pty</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>unix</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </channel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <crypto supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>qemu</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>builtin</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </crypto>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <interface supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>default</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>passt</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </interface>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <panic supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>isa</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>hyperv</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </panic>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <console supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>null</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pty</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dev</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>file</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pipe</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>stdio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>udp</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tcp</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>unix</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>qemu-vdagent</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dbus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </console>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </devices>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <gic supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <vmcoreinfo supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <genid supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <backingStoreInput supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <backup supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <async-teardown supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <s390-pv supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <ps2 supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <tdx supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <sev supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <sgx supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <hyperv supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='features'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>relaxed</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vapic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>spinlocks</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vpindex</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>runtime</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>synic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>stimer</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>reset</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vendor_id</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>frequencies</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>reenlightenment</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tlbflush</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ipi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>avic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>emsr_bitmap</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>xmm_input</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <defaults>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <spinlocks>4095</spinlocks>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <stimer_direct>on</stimer_direct>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </defaults>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </hyperv>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <launchSecurity supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </features>
Jan 26 14:55:54 compute-1 nova_compute[183403]: </domainCapabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.147 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.153 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 26 14:55:54 compute-1 nova_compute[183403]: <domainCapabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <domain>kvm</domain>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <arch>x86_64</arch>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <vcpu max='240'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <iothreads supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <os supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <enum name='firmware'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <loader supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>rom</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pflash</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='readonly'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>yes</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>no</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='secure'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>no</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </loader>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </os>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='host-passthrough' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='hostPassthroughMigratable'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>on</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>off</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='maximum' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='maximumMigratable'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>on</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>off</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='host-model' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <vendor>AMD</vendor>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='x2apic'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='hypervisor'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='stibp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='overflow-recov'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='succor'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='lbrv'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc-scale'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='flushbyasid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='pause-filter'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='pfthreshold'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='disable' name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='custom' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='ClearwaterForest'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ddpd-u'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sha512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='ClearwaterForest-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ddpd-u'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sha512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Dhyana-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Turin'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vp2intersect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibpb-brtype'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbpb'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='srso-user-kernel-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Turin-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vp2intersect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibpb-brtype'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbpb'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='srso-user-kernel-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-128'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-256'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-128'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-256'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v6'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v7'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='KnightsMill'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4fmaps'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4vnniw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512er'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512pf'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='KnightsMill-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4fmaps'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4vnniw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512er'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512pf'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G4-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tbm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G5-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tbm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='athlon'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='athlon-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='core2duo'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='core2duo-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='coreduo'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='coreduo-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='n270'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='n270-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='phenom'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='phenom-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <memoryBacking supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <enum name='sourceType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>file</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>anonymous</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>memfd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </memoryBacking>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <devices>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <disk supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='diskDevice'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>disk</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>cdrom</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>floppy</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>lun</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='bus'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ide</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>fdc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>scsi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>sata</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-non-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </disk>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <graphics supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vnc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>egl-headless</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dbus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </graphics>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <video supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='modelType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vga</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>cirrus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>none</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>bochs</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ramfb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </video>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <hostdev supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='mode'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>subsystem</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='startupPolicy'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>default</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>mandatory</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>requisite</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>optional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='subsysType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pci</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>scsi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='capsType'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='pciBackend'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </hostdev>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <rng supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-non-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>random</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>egd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>builtin</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </rng>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <filesystem supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='driverType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>path</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>handle</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtiofs</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </filesystem>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <tpm supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tpm-tis</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tpm-crb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>emulator</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>external</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendVersion'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>2.0</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </tpm>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <redirdev supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='bus'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </redirdev>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <channel supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pty</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>unix</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </channel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <crypto supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>qemu</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>builtin</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </crypto>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <interface supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>default</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>passt</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </interface>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <panic supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>isa</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>hyperv</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </panic>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <console supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>null</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pty</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dev</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>file</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pipe</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>stdio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>udp</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tcp</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>unix</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>qemu-vdagent</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dbus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </console>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </devices>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <gic supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <vmcoreinfo supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <genid supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <backingStoreInput supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <backup supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <async-teardown supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <s390-pv supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <ps2 supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <tdx supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <sev supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <sgx supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <hyperv supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='features'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>relaxed</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vapic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>spinlocks</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vpindex</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>runtime</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>synic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>stimer</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>reset</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vendor_id</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>frequencies</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>reenlightenment</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tlbflush</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ipi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>avic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>emsr_bitmap</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>xmm_input</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <defaults>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <spinlocks>4095</spinlocks>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <stimer_direct>on</stimer_direct>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </defaults>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </hyperv>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <launchSecurity supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </features>
Jan 26 14:55:54 compute-1 nova_compute[183403]: </domainCapabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.256 183407 WARNING nova.virt.libvirt.driver [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.257 183407 DEBUG nova.virt.libvirt.volume.mount [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.263 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 26 14:55:54 compute-1 nova_compute[183403]: <domainCapabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <domain>kvm</domain>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <arch>x86_64</arch>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <vcpu max='4096'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <iothreads supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <os supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <enum name='firmware'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>efi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <loader supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>rom</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pflash</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='readonly'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>yes</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>no</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='secure'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>yes</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>no</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </loader>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </os>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='host-passthrough' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='hostPassthroughMigratable'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>on</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>off</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='maximum' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='maximumMigratable'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>on</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>off</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='host-model' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <vendor>AMD</vendor>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='x2apic'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='hypervisor'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='stibp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='overflow-recov'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='succor'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='lbrv'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='tsc-scale'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='flushbyasid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='pause-filter'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='pfthreshold'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <feature policy='disable' name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <mode name='custom' supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Broadwell-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='ClearwaterForest'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ddpd-u'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sha512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='ClearwaterForest-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ddpd-u'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sha512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm3'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sm4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Cooperlake-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Denverton-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Dhyana-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Milan-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Rome-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Turin'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vp2intersect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibpb-brtype'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbpb'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='srso-user-kernel-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-Turin-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amd-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='auto-ibrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vp2intersect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fs-gs-base-ns'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibpb-brtype'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='no-nested-data-bp'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='null-sel-clr-base'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='perfmon-v2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbpb'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='srso-user-kernel-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='stibp-always-on'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='EPYC-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-128'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-256'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='GraniteRapids-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-128'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-256'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx10-512'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='prefetchiti'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Haswell-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v6'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Icelake-Server-v7'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='IvyBridge-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='KnightsMill'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4fmaps'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4vnniw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512er'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512pf'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='KnightsMill-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4fmaps'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-4vnniw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512er'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512pf'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G4-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tbm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Opteron_G5-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fma4'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tbm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xop'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SapphireRapids-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='amx-tile'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-bf16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-fp16'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512-vpopcntdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bitalg'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vbmi2'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrc'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fzrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='la57'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='taa-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='tsx-ldtrk'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='SierraForest-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ifma'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-ne-convert'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx-vnni-int8'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bhi-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='bus-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cmpccxadd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fbsdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='fsrs'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ibrs-all'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='intel-psfd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ipred-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='lam'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mcdt-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pbrsb-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='psdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rrsba-ctrl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='sbdr-ssdp-no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='serialize'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vaes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='vpclmulqdq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Client-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='hle'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='rtm'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Skylake-Server-v5'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512bw'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512cd'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512dq'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512f'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='avx512vl'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='invpcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pcid'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='pku'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='mpx'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v2'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v3'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='core-capability'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='split-lock-detect'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='Snowridge-v4'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='cldemote'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='erms'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='gfni'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdir64b'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='movdiri'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='xsaves'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='athlon'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='athlon-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='core2duo'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='core2duo-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='coreduo'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='coreduo-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='n270'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='n270-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='ss'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='phenom'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <blockers model='phenom-v1'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnow'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <feature name='3dnowext'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </blockers>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </mode>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <memoryBacking supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <enum name='sourceType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>file</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>anonymous</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <value>memfd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </memoryBacking>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <devices>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <disk supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='diskDevice'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>disk</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>cdrom</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>floppy</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>lun</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='bus'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>fdc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>scsi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>sata</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-non-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </disk>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <graphics supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vnc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>egl-headless</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dbus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </graphics>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <video supported='yes'>
Jan 26 14:55:54 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='modelType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vga</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>cirrus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>none</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>bochs</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ramfb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </video>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <hostdev supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='mode'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>subsystem</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='startupPolicy'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>default</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>mandatory</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>requisite</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>optional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='subsysType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pci</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>scsi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='capsType'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='pciBackend'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </hostdev>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <rng supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtio-non-transitional</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>random</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>egd</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>builtin</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </rng>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <filesystem supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='driverType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>path</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>handle</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>virtiofs</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </filesystem>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <tpm supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tpm-tis</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tpm-crb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>emulator</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>external</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendVersion'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>2.0</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </tpm>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <redirdev supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='bus'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>usb</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </redirdev>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <channel supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pty</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>unix</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </channel>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <crypto supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>qemu</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendModel'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>builtin</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </crypto>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <interface supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='backendType'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>default</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>passt</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </interface>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <panic supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='model'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>isa</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>hyperv</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </panic>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <console supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='type'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>null</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vc</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pty</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dev</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>file</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>pipe</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>stdio</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>udp</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tcp</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>unix</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>qemu-vdagent</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>dbus</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </console>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </devices>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <features>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <gic supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <vmcoreinfo supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <genid supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <backingStoreInput supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <backup supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <async-teardown supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <s390-pv supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <ps2 supported='yes'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <tdx supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <sev supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <sgx supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <hyperv supported='yes'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <enum name='features'>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>relaxed</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vapic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>spinlocks</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vpindex</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>runtime</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>synic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>stimer</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>reset</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>vendor_id</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>frequencies</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>reenlightenment</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>tlbflush</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>ipi</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>avic</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>emsr_bitmap</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <value>xmm_input</value>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </enum>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       <defaults>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <spinlocks>4095</spinlocks>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <stimer_direct>on</stimer_direct>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 14:55:54 compute-1 nova_compute[183403]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 14:55:54 compute-1 nova_compute[183403]:       </defaults>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     </hyperv>
Jan 26 14:55:54 compute-1 nova_compute[183403]:     <launchSecurity supported='no'/>
Jan 26 14:55:54 compute-1 nova_compute[183403]:   </features>
Jan 26 14:55:54 compute-1 nova_compute[183403]: </domainCapabilities>
Jan 26 14:55:54 compute-1 nova_compute[183403]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.363 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.371 183407 INFO nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Secure Boot support detected
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.379 183407 INFO nova.virt.libvirt.driver [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.556 183407 DEBUG nova.virt.libvirt.driver [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 26 14:55:54 compute-1 nova_compute[183403]:   <model>Nehalem</model>
Jan 26 14:55:54 compute-1 nova_compute[183403]: </cpu>
Jan 26 14:55:54 compute-1 nova_compute[183403]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Jan 26 14:55:54 compute-1 nova_compute[183403]: 2026-01-26 14:55:54.559 183407 DEBUG nova.virt.libvirt.driver [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Jan 26 14:55:55 compute-1 nova_compute[183403]: 2026-01-26 14:55:55.633 183407 INFO nova.virt.node [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Determined node identity e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from /var/lib/nova/compute_id
Jan 26 14:55:56 compute-1 nova_compute[183403]: 2026-01-26 14:55:56.144 183407 WARNING nova.compute.manager [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Compute nodes ['e3eb07a3-6ab4-4f51-ad76-347430ed2b67'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 26 14:55:56 compute-1 sshd-session[183730]: Accepted publickey for zuul from 192.168.122.30 port 59636 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 14:55:56 compute-1 systemd-logind[795]: New session 27 of user zuul.
Jan 26 14:55:56 compute-1 systemd[1]: Started Session 27 of User zuul.
Jan 26 14:55:57 compute-1 sshd-session[183730]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 14:55:57 compute-1 nova_compute[183403]: 2026-01-26 14:55:57.156 183407 INFO nova.compute.manager [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 26 14:55:57 compute-1 sshd-session[183567]: Invalid user hduser from 185.246.128.170 port 30782
Jan 26 14:55:58 compute-1 python3.9[183883]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.339 183407 WARNING nova.compute.manager [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.339 183407 DEBUG oslo_concurrency.lockutils [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.340 183407 DEBUG oslo_concurrency.lockutils [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.340 183407 DEBUG oslo_concurrency.lockutils [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.340 183407 DEBUG nova.compute.resource_tracker [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.490 183407 WARNING nova.virt.libvirt.driver [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.491 183407 DEBUG oslo_concurrency.processutils [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.508 183407 DEBUG oslo_concurrency.processutils [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.509 183407 DEBUG nova.compute.resource_tracker [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6168MB free_disk=73.34980392456055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.509 183407 DEBUG oslo_concurrency.lockutils [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:55:58 compute-1 nova_compute[183403]: 2026-01-26 14:55:58.509 183407 DEBUG oslo_concurrency.lockutils [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:55:59 compute-1 nova_compute[183403]: 2026-01-26 14:55:59.020 183407 WARNING nova.compute.resource_tracker [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] No compute node record for compute-1.ctlplane.example.com:e3eb07a3-6ab4-4f51-ad76-347430ed2b67: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host e3eb07a3-6ab4-4f51-ad76-347430ed2b67 could not be found.
Jan 26 14:55:59 compute-1 nova_compute[183403]: 2026-01-26 14:55:59.533 183407 INFO nova.compute.resource_tracker [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: e3eb07a3-6ab4-4f51-ad76-347430ed2b67
Jan 26 14:55:59 compute-1 sudo[184038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leqoaqbxpdzsjbhcavxxqmmhbcjjvbso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439358.9002295-48-237713970053652/AnsiballZ_systemd_service.py'
Jan 26 14:55:59 compute-1 sudo[184038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:55:59 compute-1 python3.9[184040]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:55:59 compute-1 systemd[1]: Reloading.
Jan 26 14:56:00 compute-1 systemd-rc-local-generator[184063]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:56:00 compute-1 systemd-sysv-generator[184067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:56:00 compute-1 sudo[184038]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:01 compute-1 nova_compute[183403]: 2026-01-26 14:56:01.064 183407 DEBUG nova.compute.resource_tracker [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 14:56:01 compute-1 nova_compute[183403]: 2026-01-26 14:56:01.064 183407 DEBUG nova.compute.resource_tracker [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:55:58 up 51 min,  0 user,  load average: 0.83, 0.90, 0.74\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 14:56:01 compute-1 python3.9[184225]: ansible-ansible.builtin.service_facts Invoked
Jan 26 14:56:01 compute-1 network[184242]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 14:56:01 compute-1 network[184243]: 'network-scripts' will be removed from distribution in near future.
Jan 26 14:56:01 compute-1 network[184244]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 14:56:02 compute-1 sshd-session[183567]: Disconnecting invalid user hduser 185.246.128.170 port 30782: Change of username or service not allowed: (hduser,ssh-connection) -> (seki,ssh-connection) [preauth]
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.134 183407 INFO nova.scheduler.client.report [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] [req-eb41f17d-a313-4ed7-b9fa-f9bf2a177591] Created resource provider record via placement API for resource provider with UUID e3eb07a3-6ab4-4f51-ad76-347430ed2b67 and name compute-1.ctlplane.example.com.
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.181 183407 DEBUG nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 26 14:56:03 compute-1 nova_compute[183403]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.181 183407 INFO nova.virt.libvirt.host [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] kernel doesn't support AMD SEV
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.182 183407 DEBUG nova.compute.provider_tree [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.182 183407 DEBUG nova.virt.libvirt.driver [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.184 183407 DEBUG nova.virt.libvirt.driver [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Libvirt baseline CPU <cpu>
Jan 26 14:56:03 compute-1 nova_compute[183403]:   <arch>x86_64</arch>
Jan 26 14:56:03 compute-1 nova_compute[183403]:   <model>Nehalem</model>
Jan 26 14:56:03 compute-1 nova_compute[183403]:   <vendor>AMD</vendor>
Jan 26 14:56:03 compute-1 nova_compute[183403]:   <topology sockets="8" cores="1" threads="1"/>
Jan 26 14:56:03 compute-1 nova_compute[183403]:   <maxphysaddr mode="emulate" bits="40"/>
Jan 26 14:56:03 compute-1 nova_compute[183403]: </cpu>
Jan 26 14:56:03 compute-1 nova_compute[183403]:  _get_guest_baseline_cpu_features /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13545
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.765 183407 DEBUG nova.scheduler.client.report [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Updated inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.765 183407 DEBUG nova.compute.provider_tree [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.765 183407 DEBUG nova.compute.provider_tree [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 14:56:03 compute-1 nova_compute[183403]: 2026-01-26 14:56:03.894 183407 DEBUG nova.compute.provider_tree [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 14:56:04 compute-1 nova_compute[183403]: 2026-01-26 14:56:04.404 183407 DEBUG nova.compute.resource_tracker [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 14:56:04 compute-1 nova_compute[183403]: 2026-01-26 14:56:04.404 183407 DEBUG oslo_concurrency.lockutils [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.895s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:56:04 compute-1 nova_compute[183403]: 2026-01-26 14:56:04.405 183407 DEBUG nova.service [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Jan 26 14:56:04 compute-1 nova_compute[183403]: 2026-01-26 14:56:04.510 183407 DEBUG nova.service [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Jan 26 14:56:04 compute-1 nova_compute[183403]: 2026-01-26 14:56:04.511 183407 DEBUG nova.servicegroup.drivers.db [None req-5fc79986-1a15-4531-ac8a-3d3bd7f04801 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Jan 26 14:56:05 compute-1 podman[184331]: 2026-01-26 14:56:05.035959546 +0000 UTC m=+0.052358899 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 14:56:05 compute-1 podman[184329]: 2026-01-26 14:56:05.075599693 +0000 UTC m=+0.093433324 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120)
Jan 26 14:56:06 compute-1 sudo[184559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnwzzdodpdjasyiqqtuvnogafydmzrdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439366.0533378-86-209141047918083/AnsiballZ_systemd_service.py'
Jan 26 14:56:06 compute-1 sudo[184559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:06 compute-1 python3.9[184561]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:56:06 compute-1 sudo[184559]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:07 compute-1 sudo[184712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzkredyjkckymmyblsrqlomsaohgdjux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439367.0566664-106-262725876426255/AnsiballZ_file.py'
Jan 26 14:56:07 compute-1 sudo[184712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:07 compute-1 python3.9[184714]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:07 compute-1 sudo[184712]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:07 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:56:07 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 14:56:08 compute-1 sudo[184865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhxakwfmiqvmnqawaksxfaklmtijxbof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439367.926488-122-80184371107222/AnsiballZ_file.py'
Jan 26 14:56:08 compute-1 sudo[184865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:08 compute-1 python3.9[184867]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:08 compute-1 sudo[184865]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:08 compute-1 sshd-session[184280]: Invalid user seki from 185.246.128.170 port 21338
Jan 26 14:56:09 compute-1 sudo[185017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opmumiwodqcftdhrltdgsocritvicolu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439368.7167554-140-270917519405254/AnsiballZ_command.py'
Jan 26 14:56:09 compute-1 sudo[185017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:09 compute-1 python3.9[185019]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:56:09 compute-1 sudo[185017]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:10 compute-1 python3.9[185171]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 14:56:11 compute-1 sudo[185321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqdkahaicikklnedeaooiyjbxwbrgthr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439370.7555137-176-204491490394913/AnsiballZ_systemd_service.py'
Jan 26 14:56:11 compute-1 sudo[185321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:11 compute-1 python3.9[185323]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:56:11 compute-1 systemd[1]: Reloading.
Jan 26 14:56:11 compute-1 systemd-rc-local-generator[185352]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:56:11 compute-1 systemd-sysv-generator[185356]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:56:11 compute-1 sudo[185321]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:12 compute-1 sudo[185508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djyjstfmwexdgoqqcpmuwnwfzyneohuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439371.8914704-192-115127643188738/AnsiballZ_command.py'
Jan 26 14:56:12 compute-1 sudo[185508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:12 compute-1 python3.9[185510]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:56:12 compute-1 sudo[185508]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:12 compute-1 sshd-session[184280]: Disconnecting invalid user seki 185.246.128.170 port 21338: Change of username or service not allowed: (seki,ssh-connection) -> (t128,ssh-connection) [preauth]
Jan 26 14:56:14 compute-1 sudo[185661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygmkkyaemvxqemmjvbldharawrznzqzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439373.760308-210-242122260848327/AnsiballZ_file.py'
Jan 26 14:56:14 compute-1 sudo[185661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:14 compute-1 python3.9[185663]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:56:14 compute-1 sudo[185661]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:15 compute-1 python3.9[185813]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:56:15 compute-1 sudo[185965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jivureijiadgmhpjjxuftdzbvvxpwbfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439375.5035524-242-155527996892707/AnsiballZ_group.py'
Jan 26 14:56:15 compute-1 sudo[185965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:16 compute-1 python3.9[185967]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 26 14:56:16 compute-1 sudo[185965]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:17 compute-1 sudo[186117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icooqrwoojjabmexyzfmeztqbstwdsov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439376.5741284-264-33002805994640/AnsiballZ_getent.py'
Jan 26 14:56:17 compute-1 sudo[186117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:17 compute-1 python3.9[186119]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 26 14:56:17 compute-1 sudo[186117]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:17 compute-1 sudo[186271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onvxejzrnwdjfeiijljedeedfcdlgeya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439377.5775318-280-16382986297676/AnsiballZ_group.py'
Jan 26 14:56:17 compute-1 sudo[186271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:18 compute-1 python3.9[186273]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 14:56:18 compute-1 groupadd[186274]: group added to /etc/group: name=ceilometer, GID=42405
Jan 26 14:56:18 compute-1 groupadd[186274]: group added to /etc/gshadow: name=ceilometer
Jan 26 14:56:18 compute-1 groupadd[186274]: new group: name=ceilometer, GID=42405
Jan 26 14:56:18 compute-1 sudo[186271]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:18 compute-1 sudo[186430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnmrljsivektfrqkwbirpzutlmavmgev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439378.343812-296-247198536560188/AnsiballZ_user.py'
Jan 26 14:56:18 compute-1 sudo[186430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:19 compute-1 python3.9[186432]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 14:56:19 compute-1 useradd[186434]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 14:56:19 compute-1 useradd[186434]: add 'ceilometer' to group 'libvirt'
Jan 26 14:56:19 compute-1 useradd[186434]: add 'ceilometer' to shadow group 'libvirt'
Jan 26 14:56:19 compute-1 sudo[186430]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:19 compute-1 sshd-session[186145]: Invalid user t128 from 185.246.128.170 port 54313
Jan 26 14:56:19 compute-1 sshd-session[186145]: Disconnecting invalid user t128 185.246.128.170 port 54313: Change of username or service not allowed: (t128,ssh-connection) -> (squid,ssh-connection) [preauth]
Jan 26 14:56:21 compute-1 python3.9[186590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:22 compute-1 python3.9[186711]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769439380.7647538-348-74337311187858/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:22 compute-1 python3.9[186862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:23 compute-1 python3.9[186983]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769439382.1742494-348-182399600344587/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:23 compute-1 python3.9[187134]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:24 compute-1 python3.9[187255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769439383.3262246-348-184378952837350/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:25 compute-1 python3.9[187405]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:56:26 compute-1 python3.9[187557]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:56:26 compute-1 python3.9[187709]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:27 compute-1 python3.9[187830]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439386.3159392-466-176822369662094/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:56:27 compute-1 python3.9[187980]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:28 compute-1 python3.9[188101]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439387.5250041-466-275656166828356/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:56:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:56:29.006 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:56:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:56:29.008 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:56:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:56:29.008 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:56:29 compute-1 python3.9[188252]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:30 compute-1 python3.9[188373]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439389.316584-524-202289049980159/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:56:31 compute-1 python3.9[188523]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:31 compute-1 python3.9[188644]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439390.7452178-556-209558149889776/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:32 compute-1 python3.9[188794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:33 compute-1 python3.9[188915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439391.9685633-586-186952765126628/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:33 compute-1 python3.9[189065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:34 compute-1 python3.9[189186]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439393.2058263-616-243015645591703/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:34 compute-1 sudo[189336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mobhmovevwnrsjtqldyobplxgxujbxyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439394.5489595-646-13424702257472/AnsiballZ_file.py'
Jan 26 14:56:34 compute-1 sudo[189336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:35 compute-1 python3.9[189338]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:35 compute-1 sudo[189336]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:35 compute-1 podman[189340]: 2026-01-26 14:56:35.200506681 +0000 UTC m=+0.053028687 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 14:56:35 compute-1 podman[189339]: 2026-01-26 14:56:35.260373821 +0000 UTC m=+0.113186725 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 14:56:35 compute-1 sudo[189531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohpvmzexxaypworwuywjgpzaglrfectq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439395.3315363-662-113490778157181/AnsiballZ_file.py'
Jan 26 14:56:35 compute-1 sudo[189531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:35 compute-1 sshd-session[186780]: Invalid user squid from 185.246.128.170 port 25726
Jan 26 14:56:35 compute-1 python3.9[189533]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:35 compute-1 sudo[189531]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:36 compute-1 sshd-session[186780]: Disconnecting invalid user squid 185.246.128.170 port 25726: Change of username or service not allowed: (squid,ssh-connection) -> (itadmin,ssh-connection) [preauth]
Jan 26 14:56:36 compute-1 python3.9[189683]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:56:37 compute-1 python3.9[189835]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:56:37 compute-1 nova_compute[183403]: 2026-01-26 14:56:37.514 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:37 compute-1 python3.9[189987]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:56:38 compute-1 nova_compute[183403]: 2026-01-26 14:56:38.183 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:38 compute-1 sudo[190141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irwndiihajzowxezhhbyvdbfoxxygcyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439398.2880316-726-176692443980678/AnsiballZ_file.py'
Jan 26 14:56:38 compute-1 sudo[190141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:38 compute-1 python3.9[190143]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:56:38 compute-1 sudo[190141]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:39 compute-1 sudo[190293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbzuioaibkefhapalmlszpdclawxygjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439398.988302-742-107032634030887/AnsiballZ_systemd_service.py'
Jan 26 14:56:39 compute-1 sudo[190293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:39 compute-1 python3.9[190295]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:56:39 compute-1 systemd[1]: Reloading.
Jan 26 14:56:39 compute-1 systemd-rc-local-generator[190323]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:56:39 compute-1 systemd-sysv-generator[190327]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:56:40 compute-1 systemd[1]: Listening on Podman API Socket.
Jan 26 14:56:40 compute-1 sudo[190293]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:40 compute-1 sudo[190484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfoijnmbieiwplmazyvwcxnclxiqnhda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439400.3328228-760-87799075906960/AnsiballZ_stat.py'
Jan 26 14:56:40 compute-1 sudo[190484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:40 compute-1 python3.9[190486]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:40 compute-1 sudo[190484]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:41 compute-1 sudo[190607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aszutyjubpmasevfveokbbmgqeqheirv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439400.3328228-760-87799075906960/AnsiballZ_copy.py'
Jan 26 14:56:41 compute-1 sudo[190607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:41 compute-1 python3.9[190609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439400.3328228-760-87799075906960/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:56:41 compute-1 sudo[190607]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:42 compute-1 sudo[190759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uliwksbedwebfifqwalvussfyazodgdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439402.1023483-802-151013032825971/AnsiballZ_file.py'
Jan 26 14:56:42 compute-1 sudo[190759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:42 compute-1 python3.9[190761]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:42 compute-1 sudo[190759]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:43 compute-1 sudo[190911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvoxwxbulvqrnbifsuatzlfwisjxtbcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439402.8743494-818-236627866612391/AnsiballZ_file.py'
Jan 26 14:56:43 compute-1 sudo[190911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:43 compute-1 python3.9[190913]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:56:43 compute-1 sudo[190911]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:44 compute-1 python3.9[191063]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:46 compute-1 sudo[191484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxsfdpzxwroxqfmpvgbpokrvuvvrchci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439405.8968282-886-158089046489700/AnsiballZ_container_config_data.py'
Jan 26 14:56:46 compute-1 sudo[191484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:46 compute-1 python3.9[191486]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 26 14:56:46 compute-1 sudo[191484]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:47 compute-1 sudo[191636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idkmoeuzxurjkkcwjuvnqxenaipldeer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439406.985821-908-248982246041106/AnsiballZ_container_config_hash.py'
Jan 26 14:56:47 compute-1 sudo[191636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:47 compute-1 python3.9[191638]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 14:56:47 compute-1 sudo[191636]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:48 compute-1 sudo[191788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sagoduvggbhvxtrdqefmfyytwefucwga ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769439407.922057-928-107257126887224/AnsiballZ_edpm_container_manage.py'
Jan 26 14:56:48 compute-1 sudo[191788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:48 compute-1 sshd-session[190014]: Invalid user itadmin from 185.246.128.170 port 7223
Jan 26 14:56:48 compute-1 python3[191790]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 14:56:48 compute-1 sshd-session[190014]: Disconnecting invalid user itadmin 185.246.128.170 port 7223: Change of username or service not allowed: (itadmin,ssh-connection) -> (marek,ssh-connection) [preauth]
Jan 26 14:56:50 compute-1 podman[191801]: 2026-01-26 14:56:50.671280601 +0000 UTC m=+1.946939876 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 26 14:56:50 compute-1 podman[191898]: 2026-01-26 14:56:50.787825924 +0000 UTC m=+0.044618560 container create 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible)
Jan 26 14:56:50 compute-1 podman[191898]: 2026-01-26 14:56:50.762146354 +0000 UTC m=+0.018939020 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 26 14:56:50 compute-1 python3[191790]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 26 14:56:50 compute-1 sudo[191788]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:51 compute-1 sudo[192087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khwwfornpwsernctaxpjuejrlugdxpek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439411.1233313-944-196875319067870/AnsiballZ_stat.py'
Jan 26 14:56:51 compute-1 sudo[192087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:51 compute-1 nova_compute[183403]: 2026-01-26 14:56:51.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:51 compute-1 nova_compute[183403]: 2026-01-26 14:56:51.579 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:51 compute-1 nova_compute[183403]: 2026-01-26 14:56:51.579 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:51 compute-1 nova_compute[183403]: 2026-01-26 14:56:51.579 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:51 compute-1 nova_compute[183403]: 2026-01-26 14:56:51.579 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:51 compute-1 nova_compute[183403]: 2026-01-26 14:56:51.580 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:51 compute-1 nova_compute[183403]: 2026-01-26 14:56:51.580 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:51 compute-1 nova_compute[183403]: 2026-01-26 14:56:51.580 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 14:56:51 compute-1 nova_compute[183403]: 2026-01-26 14:56:51.580 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:56:51 compute-1 python3.9[192089]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:56:51 compute-1 sudo[192087]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.228 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.228 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.228 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.228 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 14:56:52 compute-1 sudo[192241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffavlslyjkutpddexlkufrpneqczazan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439411.9505923-962-141515034327412/AnsiballZ_file.py'
Jan 26 14:56:52 compute-1 sudo[192241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.367 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.369 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.388 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.389 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6071MB free_disk=73.30074310302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.389 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:56:52 compute-1 nova_compute[183403]: 2026-01-26 14:56:52.389 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:56:52 compute-1 python3.9[192243]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:52 compute-1 sudo[192241]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:52 compute-1 sudo[192318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhholofvctkxufxljsfdaeobdbkbympb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439411.9505923-962-141515034327412/AnsiballZ_stat.py'
Jan 26 14:56:52 compute-1 sudo[192318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:52 compute-1 python3.9[192320]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:56:52 compute-1 sudo[192318]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:53 compute-1 nova_compute[183403]: 2026-01-26 14:56:53.543 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 14:56:53 compute-1 nova_compute[183403]: 2026-01-26 14:56:53.543 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:56:52 up 52 min,  0 user,  load average: 0.71, 0.85, 0.73\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 14:56:53 compute-1 nova_compute[183403]: 2026-01-26 14:56:53.563 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 14:56:53 compute-1 sudo[192469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzftlhnvtllnyafuhpdtixqswycynzsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439412.9522498-962-93285135337017/AnsiballZ_copy.py'
Jan 26 14:56:53 compute-1 sudo[192469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:53 compute-1 python3.9[192471]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769439412.9522498-962-93285135337017/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:53 compute-1 sudo[192469]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:54 compute-1 nova_compute[183403]: 2026-01-26 14:56:54.077 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 14:56:54 compute-1 sshd-session[191814]: Invalid user marek from 185.246.128.170 port 56168
Jan 26 14:56:54 compute-1 sudo[192545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdopgtfntiiewcdzkagsareydzcwyujf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439412.9522498-962-93285135337017/AnsiballZ_systemd.py'
Jan 26 14:56:54 compute-1 sudo[192545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:54 compute-1 nova_compute[183403]: 2026-01-26 14:56:54.642 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 14:56:54 compute-1 nova_compute[183403]: 2026-01-26 14:56:54.642 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.253s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:56:54 compute-1 python3.9[192547]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:56:54 compute-1 systemd[1]: Reloading.
Jan 26 14:56:54 compute-1 systemd-rc-local-generator[192572]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:56:54 compute-1 systemd-sysv-generator[192578]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:56:55 compute-1 sudo[192545]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:55 compute-1 sudo[192656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnwtrbykrortkzojuiefcwjckssevmno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439412.9522498-962-93285135337017/AnsiballZ_systemd.py'
Jan 26 14:56:55 compute-1 sudo[192656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:55 compute-1 python3.9[192658]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:56:55 compute-1 systemd[1]: Reloading.
Jan 26 14:56:55 compute-1 systemd-sysv-generator[192689]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:56:55 compute-1 systemd-rc-local-generator[192686]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:56:56 compute-1 systemd[1]: Starting podman_exporter container...
Jan 26 14:56:56 compute-1 systemd[1]: Started libcrun container.
Jan 26 14:56:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e280e26427fb002ae7da09f1aea93a51621c16406e137cc26fe5dc6239c443/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 26 14:56:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e280e26427fb002ae7da09f1aea93a51621c16406e137cc26fe5dc6239c443/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 26 14:56:56 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f.
Jan 26 14:56:56 compute-1 podman[192698]: 2026-01-26 14:56:56.175266524 +0000 UTC m=+0.139230733 container init 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 14:56:56 compute-1 podman[192698]: 2026-01-26 14:56:56.20109306 +0000 UTC m=+0.165057199 container start 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 14:56:56 compute-1 podman_exporter[192713]: ts=2026-01-26T14:56:56.201Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 26 14:56:56 compute-1 podman_exporter[192713]: ts=2026-01-26T14:56:56.201Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 26 14:56:56 compute-1 podman_exporter[192713]: ts=2026-01-26T14:56:56.201Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 26 14:56:56 compute-1 podman_exporter[192713]: ts=2026-01-26T14:56:56.201Z caller=handler.go:105 level=info collector=container
Jan 26 14:56:56 compute-1 podman[192698]: podman_exporter
Jan 26 14:56:56 compute-1 systemd[1]: Starting Podman API Service...
Jan 26 14:56:56 compute-1 systemd[1]: Started Podman API Service.
Jan 26 14:56:56 compute-1 systemd[1]: Started podman_exporter container.
Jan 26 14:56:56 compute-1 podman[192725]: time="2026-01-26T14:56:56Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 26 14:56:56 compute-1 podman[192725]: time="2026-01-26T14:56:56Z" level=info msg="Setting parallel job count to 25"
Jan 26 14:56:56 compute-1 podman[192725]: time="2026-01-26T14:56:56Z" level=info msg="Using sqlite as database backend"
Jan 26 14:56:56 compute-1 podman[192725]: time="2026-01-26T14:56:56Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 26 14:56:56 compute-1 podman[192725]: time="2026-01-26T14:56:56Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 26 14:56:56 compute-1 podman[192725]: time="2026-01-26T14:56:56Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 26 14:56:56 compute-1 podman[192725]: @ - - [26/Jan/2026:14:56:56 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 26 14:56:56 compute-1 podman[192725]: time="2026-01-26T14:56:56Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 14:56:56 compute-1 sudo[192656]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:56 compute-1 podman[192725]: @ - - [26/Jan/2026:14:56:56 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 12118 "" "Go-http-client/1.1"
Jan 26 14:56:56 compute-1 podman_exporter[192713]: ts=2026-01-26T14:56:56.283Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 26 14:56:56 compute-1 podman_exporter[192713]: ts=2026-01-26T14:56:56.284Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 26 14:56:56 compute-1 podman_exporter[192713]: ts=2026-01-26T14:56:56.285Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 26 14:56:56 compute-1 podman[192722]: 2026-01-26 14:56:56.287280649 +0000 UTC m=+0.065659614 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 14:56:56 compute-1 systemd[1]: 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f-7b1653b8f12694fb.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 14:56:56 compute-1 systemd[1]: 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f-7b1653b8f12694fb.service: Failed with result 'exit-code'.
Jan 26 14:56:57 compute-1 python3.9[192910]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 14:56:58 compute-1 sshd-session[191814]: Disconnecting invalid user marek 185.246.128.170 port 56168: Change of username or service not allowed: (marek,ssh-connection) -> (ftp_inst,ssh-connection) [preauth]
Jan 26 14:56:58 compute-1 sudo[193060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzwbwikawcemtquzgvmwucbawidccqvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439418.5426147-1052-194617049035837/AnsiballZ_stat.py'
Jan 26 14:56:58 compute-1 sudo[193060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:59 compute-1 python3.9[193062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:56:59 compute-1 sudo[193060]: pam_unix(sudo:session): session closed for user root
Jan 26 14:56:59 compute-1 sudo[193185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtqfaxnewcbxbsnpycdekjzgkhazjljp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439418.5426147-1052-194617049035837/AnsiballZ_copy.py'
Jan 26 14:56:59 compute-1 sudo[193185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:56:59 compute-1 python3.9[193187]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439418.5426147-1052-194617049035837/.source.yaml _original_basename=.r21od58w follow=False checksum=553e570b5231c539b270f06e16fa0b248613a4ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:56:59 compute-1 sudo[193185]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:00 compute-1 sudo[193337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaxhbjjftbjzrelxgojkruwhpddscios ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439419.811445-1082-182802603361743/AnsiballZ_stat.py'
Jan 26 14:57:00 compute-1 sudo[193337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:00 compute-1 python3.9[193339]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:57:00 compute-1 sudo[193337]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:00 compute-1 sudo[193460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtmqcuegybequmiihovzzgtiulhqaels ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439419.811445-1082-182802603361743/AnsiballZ_copy.py'
Jan 26 14:57:00 compute-1 sudo[193460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:00 compute-1 python3.9[193462]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769439419.811445-1082-182802603361743/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:57:00 compute-1 sudo[193460]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:01 compute-1 sudo[193612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjsdwvftdkhkxfggmygdnqpticgqvbmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439421.5764208-1124-249008055409055/AnsiballZ_file.py'
Jan 26 14:57:01 compute-1 sudo[193612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:02 compute-1 python3.9[193614]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:57:02 compute-1 sudo[193612]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:02 compute-1 sudo[193764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lejfpsbjosqtwlnlxamcjsirzurgibdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439422.24981-1140-19660174541807/AnsiballZ_file.py'
Jan 26 14:57:02 compute-1 sudo[193764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:02 compute-1 python3.9[193766]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 14:57:02 compute-1 sudo[193764]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:03 compute-1 python3.9[193916]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:57:05 compute-1 sudo[194367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uikqupikejurimecqvmxlwosfneabnbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439425.106966-1208-94765392849180/AnsiballZ_container_config_data.py'
Jan 26 14:57:05 compute-1 sudo[194367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:05 compute-1 podman[194312]: 2026-01-26 14:57:05.413204094 +0000 UTC m=+0.048846144 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 14:57:05 compute-1 podman[194311]: 2026-01-26 14:57:05.454504682 +0000 UTC m=+0.091976515 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260120, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 14:57:05 compute-1 python3.9[194374]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 26 14:57:05 compute-1 sudo[194367]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:06 compute-1 sudo[194530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bysxywhrgygwlgnwkdjdzrxwupvyygyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439426.0342262-1230-28870158166094/AnsiballZ_container_config_hash.py'
Jan 26 14:57:06 compute-1 sudo[194530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:06 compute-1 python3.9[194532]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 14:57:06 compute-1 sudo[194530]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:07 compute-1 sudo[194682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzpofbxjuirtcbqdplcfphvfkvnmfywh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769439426.915943-1250-103040891123983/AnsiballZ_edpm_container_manage.py'
Jan 26 14:57:07 compute-1 sudo[194682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:07 compute-1 python3[194684]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 14:57:09 compute-1 podman[194697]: 2026-01-26 14:57:09.788628522 +0000 UTC m=+2.211926910 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 26 14:57:09 compute-1 podman[194794]: 2026-01-26 14:57:09.989548273 +0000 UTC m=+0.095294739 container create 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, version=9.6, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, 
architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 14:57:09 compute-1 podman[194794]: 2026-01-26 14:57:09.920549636 +0000 UTC m=+0.026296082 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 26 14:57:09 compute-1 python3[194684]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 26 14:57:10 compute-1 sudo[194682]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:15 compute-1 sudo[194982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmtktdtenzsvjqznpxzybzufvuacqsxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439435.7502506-1266-121802099213019/AnsiballZ_stat.py'
Jan 26 14:57:16 compute-1 sudo[194982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:16 compute-1 python3.9[194984]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:57:16 compute-1 sudo[194982]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:16 compute-1 sshd-session[194711]: Invalid user ftp_inst from 185.246.128.170 port 61178
Jan 26 14:57:16 compute-1 sudo[195136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxioizlicqefxapssdycegbjabkgnvav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439436.4836972-1284-198037764269077/AnsiballZ_file.py'
Jan 26 14:57:16 compute-1 sudo[195136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:16 compute-1 sshd-session[194711]: Disconnecting invalid user ftp_inst 185.246.128.170 port 61178: Change of username or service not allowed: (ftp_inst,ssh-connection) -> (user,ssh-connection) [preauth]
Jan 26 14:57:17 compute-1 python3.9[195138]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:57:17 compute-1 sudo[195136]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:17 compute-1 sudo[195212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kguqkiqxzewcxycogoacxtzqvuxdkvln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439436.4836972-1284-198037764269077/AnsiballZ_stat.py'
Jan 26 14:57:17 compute-1 sudo[195212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:17 compute-1 python3.9[195214]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:57:17 compute-1 sudo[195212]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:18 compute-1 sudo[195363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xplqnudylnuanstueclwpmtloohnobyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439437.7897675-1284-163594626783215/AnsiballZ_copy.py'
Jan 26 14:57:18 compute-1 sudo[195363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:18 compute-1 python3.9[195365]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769439437.7897675-1284-163594626783215/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:57:18 compute-1 sudo[195363]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:18 compute-1 sudo[195439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbjpzgpwvjdlempyclmcfddpymrvrdix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439437.7897675-1284-163594626783215/AnsiballZ_systemd.py'
Jan 26 14:57:18 compute-1 sudo[195439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:19 compute-1 python3.9[195441]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 14:57:19 compute-1 systemd[1]: Reloading.
Jan 26 14:57:19 compute-1 systemd-rc-local-generator[195469]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:57:19 compute-1 systemd-sysv-generator[195472]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:57:19 compute-1 sudo[195439]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:19 compute-1 sudo[195550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwsnnvbswyxmjxdopukamlyvznvpssfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439437.7897675-1284-163594626783215/AnsiballZ_systemd.py'
Jan 26 14:57:19 compute-1 sudo[195550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:20 compute-1 python3.9[195553]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 14:57:20 compute-1 systemd[1]: Reloading.
Jan 26 14:57:20 compute-1 systemd-sysv-generator[195579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 14:57:20 compute-1 systemd-rc-local-generator[195576]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 14:57:20 compute-1 auditd[706]: Audit daemon rotating log files
Jan 26 14:57:20 compute-1 systemd[1]: Starting openstack_network_exporter container...
Jan 26 14:57:22 compute-1 systemd[1]: Started libcrun container.
Jan 26 14:57:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d9d26749a2b544bcb3b1b551f80620480a34115aa5d118ef936cdb5b22e5198/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 26 14:57:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d9d26749a2b544bcb3b1b551f80620480a34115aa5d118ef936cdb5b22e5198/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 26 14:57:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d9d26749a2b544bcb3b1b551f80620480a34115aa5d118ef936cdb5b22e5198/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 26 14:57:22 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5.
Jan 26 14:57:23 compute-1 podman[195594]: 2026-01-26 14:57:23.638704617 +0000 UTC m=+3.041776521 container init 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:48: registering *bridge.Collector
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:48: registering *coverage.Collector
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:48: registering *datapath.Collector
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:48: registering *iface.Collector
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:48: registering *memory.Collector
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:48: registering *ovn.Collector
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:48: registering *pmd_perf.Collector
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:48: registering *pmd_rxq.Collector
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: INFO    14:57:23 main.go:48: registering *vswitch.Collector
Jan 26 14:57:23 compute-1 openstack_network_exporter[195610]: NOTICE  14:57:23 main.go:76: listening on https://:9105/metrics
Jan 26 14:57:23 compute-1 podman[195594]: 2026-01-26 14:57:23.670805132 +0000 UTC m=+3.073877026 container start 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350)
Jan 26 14:57:23 compute-1 podman[195594]: openstack_network_exporter
Jan 26 14:57:23 compute-1 systemd[1]: Started openstack_network_exporter container.
Jan 26 14:57:23 compute-1 sudo[195550]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:24 compute-1 podman[195620]: 2026-01-26 14:57:24.027105335 +0000 UTC m=+0.342307645 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 14:57:24 compute-1 python3.9[195792]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 14:57:25 compute-1 sudo[195942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plypqailrcfvbuymwrzhdddmcaaadebh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439445.3708584-1374-61387414154328/AnsiballZ_stat.py'
Jan 26 14:57:25 compute-1 sudo[195942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:25 compute-1 python3.9[195944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:57:25 compute-1 sudo[195942]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:26 compute-1 sudo[196067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eemurdvztsvupbkfjchxasxzecgdcmyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439445.3708584-1374-61387414154328/AnsiballZ_copy.py'
Jan 26 14:57:26 compute-1 sudo[196067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:26 compute-1 podman[196069]: 2026-01-26 14:57:26.41590216 +0000 UTC m=+0.057312545 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 14:57:26 compute-1 python3.9[196070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439445.3708584-1374-61387414154328/.source.yaml _original_basename=.9yd_penx follow=False checksum=94e98258a9ed42ec4948fd1ff4bdb10d3b47fdf0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:57:26 compute-1 sudo[196067]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:26 compute-1 sshd-session[195552]: Invalid user user from 185.246.128.170 port 35027
Jan 26 14:57:27 compute-1 sudo[196243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amlbegspjcuxlqxzaantusglipugkagl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439446.747712-1404-253643715252493/AnsiballZ_find.py'
Jan 26 14:57:27 compute-1 sudo[196243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:27 compute-1 python3.9[196245]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 14:57:27 compute-1 sudo[196243]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:57:29.009 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:57:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:57:29.010 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:57:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:57:29.010 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:57:35 compute-1 podman[196272]: 2026-01-26 14:57:35.876261626 +0000 UTC m=+0.055461454 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 14:57:35 compute-1 podman[196271]: 2026-01-26 14:57:35.949476014 +0000 UTC m=+0.121446226 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 14:57:40 compute-1 sshd-session[195552]: error: maximum authentication attempts exceeded for invalid user user from 185.246.128.170 port 35027 ssh2 [preauth]
Jan 26 14:57:40 compute-1 sshd-session[195552]: Disconnecting invalid user user 185.246.128.170 port 35027: Too many authentication failures [preauth]
Jan 26 14:57:44 compute-1 sshd-session[196317]: Invalid user user from 185.246.128.170 port 34827
Jan 26 14:57:44 compute-1 sudo[196444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldotrqwpjddkhyoweemnqypaknykhals ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439464.6126423-1554-66751084220400/AnsiballZ_podman_container_info.py'
Jan 26 14:57:44 compute-1 sudo[196444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:45 compute-1 python3.9[196446]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 26 14:57:45 compute-1 sshd-session[196317]: Disconnecting invalid user user 185.246.128.170 port 34827: Change of username or service not allowed: (user,ssh-connection) -> (momoru,ssh-connection) [preauth]
Jan 26 14:57:45 compute-1 sudo[196444]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:45 compute-1 sudo[196609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cablnqqgbpdnzsnqeggnxzlzgcwwjbtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439465.3976796-1562-227309720919796/AnsiballZ_podman_container_exec.py'
Jan 26 14:57:45 compute-1 sudo[196609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:45 compute-1 python3.9[196611]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 14:57:46 compute-1 systemd[1]: Started libpod-conmon-b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1.scope.
Jan 26 14:57:46 compute-1 podman[196614]: 2026-01-26 14:57:46.836593615 +0000 UTC m=+0.865116376 container exec b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 14:57:47 compute-1 podman[196633]: 2026-01-26 14:57:47.009542758 +0000 UTC m=+0.145322197 container exec_died b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 14:57:47 compute-1 podman[196614]: 2026-01-26 14:57:47.053581904 +0000 UTC m=+1.082104645 container exec_died b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 14:57:47 compute-1 systemd[1]: libpod-conmon-b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1.scope: Deactivated successfully.
Jan 26 14:57:47 compute-1 sudo[196609]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:47 compute-1 sudo[196795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfuflyxgagxlnrcmfzjtjwvtveydjyil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439467.4735146-1570-94640964453416/AnsiballZ_podman_container_exec.py'
Jan 26 14:57:47 compute-1 sudo[196795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:47 compute-1 sshd-session[196612]: Invalid user momoru from 185.246.128.170 port 42844
Jan 26 14:57:48 compute-1 python3.9[196797]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 14:57:48 compute-1 sshd-session[196612]: Disconnecting invalid user momoru 185.246.128.170 port 42844: Change of username or service not allowed: (momoru,ssh-connection) -> (rafael,ssh-connection) [preauth]
Jan 26 14:57:48 compute-1 systemd[1]: Started libpod-conmon-b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1.scope.
Jan 26 14:57:48 compute-1 podman[196798]: 2026-01-26 14:57:48.63008607 +0000 UTC m=+0.581093595 container exec b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 14:57:48 compute-1 podman[196798]: 2026-01-26 14:57:48.927988297 +0000 UTC m=+0.878995822 container exec_died b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 14:57:49 compute-1 sudo[196795]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:49 compute-1 systemd[1]: libpod-conmon-b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1.scope: Deactivated successfully.
Jan 26 14:57:49 compute-1 sudo[196980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skokivdrxwqpezdrxjrcazxzmpwndfsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439469.5278997-1578-48482403382270/AnsiballZ_file.py'
Jan 26 14:57:49 compute-1 sudo[196980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:49 compute-1 python3.9[196982]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:57:50 compute-1 sudo[196980]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:50 compute-1 sudo[197133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvxflpfjmucqzhmirsbnjlaqekmwyhrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439470.2313478-1587-141507096221717/AnsiballZ_podman_container_info.py'
Jan 26 14:57:50 compute-1 sudo[197133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:50 compute-1 python3.9[197135]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 26 14:57:50 compute-1 sudo[197133]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:51 compute-1 sudo[197298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxbgveinitkludavtqaeqdtmhaiihsxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439471.0343378-1595-144697319679038/AnsiballZ_podman_container_exec.py'
Jan 26 14:57:51 compute-1 sudo[197298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:51 compute-1 python3.9[197300]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 14:57:51 compute-1 sshd-session[196827]: Invalid user rafael from 185.246.128.170 port 24295
Jan 26 14:57:52 compute-1 sshd-session[196827]: Disconnecting invalid user rafael 185.246.128.170 port 24295: Change of username or service not allowed: (rafael,ssh-connection) -> (api,ssh-connection) [preauth]
Jan 26 14:57:52 compute-1 systemd[1]: Started libpod-conmon-bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20.scope.
Jan 26 14:57:52 compute-1 podman[197301]: 2026-01-26 14:57:52.386985648 +0000 UTC m=+0.809746568 container exec bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 14:57:52 compute-1 podman[197301]: 2026-01-26 14:57:52.423674614 +0000 UTC m=+0.846435424 container exec_died bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Jan 26 14:57:53 compute-1 sudo[197298]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:53 compute-1 systemd[1]: libpod-conmon-bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20.scope: Deactivated successfully.
Jan 26 14:57:53 compute-1 sudo[197482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkoryxnbtmaokufvzzajkgurnhtjjuhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439473.5540464-1603-186419806581453/AnsiballZ_podman_container_exec.py'
Jan 26 14:57:53 compute-1 sudo[197482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:54 compute-1 python3.9[197484]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 14:57:54 compute-1 nova_compute[183403]: 2026-01-26 14:57:54.635 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:57:54 compute-1 nova_compute[183403]: 2026-01-26 14:57:54.637 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:57:54 compute-1 systemd[1]: Started libpod-conmon-bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20.scope.
Jan 26 14:57:54 compute-1 podman[197485]: 2026-01-26 14:57:54.91695671 +0000 UTC m=+0.849741508 container exec bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.292 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.292 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.292 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.293 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.294 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.295 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.295 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.295 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:57:55 compute-1 podman[197498]: 2026-01-26 14:57:55.40748419 +0000 UTC m=+0.571194343 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Jan 26 14:57:55 compute-1 podman[197485]: 2026-01-26 14:57:55.42112365 +0000 UTC m=+1.353908428 container exec_died bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.810 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.812 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.812 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.812 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 14:57:55 compute-1 sudo[197482]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:55 compute-1 systemd[1]: libpod-conmon-bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20.scope: Deactivated successfully.
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.973 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.974 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.998 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.998 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5977MB free_disk=73.1856689453125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.998 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:57:55 compute-1 nova_compute[183403]: 2026-01-26 14:57:55.999 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:57:56 compute-1 sudo[197689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wavzjjaiumxatqgtfbybushikyojkzvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439476.0109856-1611-91596595251203/AnsiballZ_file.py'
Jan 26 14:57:56 compute-1 sudo[197689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:56 compute-1 python3.9[197691]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:57:56 compute-1 sudo[197689]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:56 compute-1 podman[197752]: 2026-01-26 14:57:56.91148909 +0000 UTC m=+0.080562977 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 14:57:57 compute-1 nova_compute[183403]: 2026-01-26 14:57:57.077 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 14:57:57 compute-1 nova_compute[183403]: 2026-01-26 14:57:57.077 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:57:55 up 53 min,  0 user,  load average: 0.47, 0.75, 0.70\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 14:57:57 compute-1 nova_compute[183403]: 2026-01-26 14:57:57.095 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 14:57:57 compute-1 sudo[197866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpntwxtfgmruozzwhfjhtrgjmfagujlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439476.7591727-1620-172262627893376/AnsiballZ_podman_container_info.py'
Jan 26 14:57:57 compute-1 sudo[197866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:57 compute-1 python3.9[197868]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 26 14:57:57 compute-1 sudo[197866]: pam_unix(sudo:session): session closed for user root
Jan 26 14:57:57 compute-1 nova_compute[183403]: 2026-01-26 14:57:57.607 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 14:57:58 compute-1 sudo[198033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upnsvylnpimbwklbvlbivaqxueeibirq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439477.7917852-1628-179570329578701/AnsiballZ_podman_container_exec.py'
Jan 26 14:57:58 compute-1 sudo[198033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:57:58 compute-1 nova_compute[183403]: 2026-01-26 14:57:58.124 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 14:57:58 compute-1 nova_compute[183403]: 2026-01-26 14:57:58.124 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:57:58 compute-1 python3.9[198035]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 14:57:58 compute-1 systemd[1]: Started libpod-conmon-46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f.scope.
Jan 26 14:57:58 compute-1 podman[198036]: 2026-01-26 14:57:58.649105396 +0000 UTC m=+0.360158567 container exec 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 14:57:58 compute-1 podman[198056]: 2026-01-26 14:57:58.820577331 +0000 UTC m=+0.158423102 container exec_died 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 14:57:59 compute-1 podman[198036]: 2026-01-26 14:57:59.33888498 +0000 UTC m=+1.049938151 container exec_died 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 14:57:59 compute-1 systemd[1]: libpod-conmon-46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f.scope: Deactivated successfully.
Jan 26 14:57:59 compute-1 sudo[198033]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:00 compute-1 sudo[198219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnpfwazrlppdkpktldwleeqeaxhrsgch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439479.886288-1636-140355350674276/AnsiballZ_podman_container_exec.py'
Jan 26 14:58:00 compute-1 sudo[198219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:00 compute-1 sshd-session[197906]: Invalid user api from 185.246.128.170 port 56981
Jan 26 14:58:00 compute-1 sshd-session[197906]: Disconnecting invalid user api 185.246.128.170 port 56981: Change of username or service not allowed: (api,ssh-connection) -> (tmax,ssh-connection) [preauth]
Jan 26 14:58:00 compute-1 python3.9[198221]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 14:58:01 compute-1 systemd[1]: Started libpod-conmon-46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f.scope.
Jan 26 14:58:01 compute-1 podman[198222]: 2026-01-26 14:58:01.081203604 +0000 UTC m=+0.528119616 container exec 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 14:58:01 compute-1 podman[198222]: 2026-01-26 14:58:01.250639993 +0000 UTC m=+0.697555955 container exec_died 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 14:58:01 compute-1 sudo[198219]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:01 compute-1 systemd[1]: libpod-conmon-46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f.scope: Deactivated successfully.
Jan 26 14:58:01 compute-1 sudo[198401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fehtuutlodralnrdbcmsecanvffmlhap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439481.5149577-1644-2724835511384/AnsiballZ_file.py'
Jan 26 14:58:01 compute-1 sudo[198401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:01 compute-1 python3.9[198403]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:02 compute-1 sudo[198401]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:02 compute-1 sudo[198554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dekfvotkzhttcvxstrktfknapjyisvue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439482.3079922-1653-95628408484110/AnsiballZ_podman_container_info.py'
Jan 26 14:58:02 compute-1 sudo[198554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:02 compute-1 python3.9[198556]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 26 14:58:02 compute-1 sudo[198554]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:03 compute-1 sudo[198719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjuzfurygdbkvszepprqgdnesjxukqhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439483.045862-1661-161064927065932/AnsiballZ_podman_container_exec.py'
Jan 26 14:58:03 compute-1 sudo[198719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:03 compute-1 python3.9[198721]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 14:58:03 compute-1 systemd[1]: Started libpod-conmon-90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5.scope.
Jan 26 14:58:03 compute-1 podman[198722]: 2026-01-26 14:58:03.947358824 +0000 UTC m=+0.349951540 container exec 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=)
Jan 26 14:58:04 compute-1 podman[198742]: 2026-01-26 14:58:04.076487449 +0000 UTC m=+0.110724456 container exec_died 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git)
Jan 26 14:58:04 compute-1 podman[198722]: 2026-01-26 14:58:04.160904741 +0000 UTC m=+0.563497457 container exec_died 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Jan 26 14:58:04 compute-1 systemd[1]: libpod-conmon-90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5.scope: Deactivated successfully.
Jan 26 14:58:04 compute-1 sudo[198719]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:04 compute-1 sudo[198904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joqqkhfkbmgtqrxtpfrbwsjyfnsgrqiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439484.4617274-1669-121608850921652/AnsiballZ_podman_container_exec.py'
Jan 26 14:58:04 compute-1 sudo[198904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:05 compute-1 python3.9[198906]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 14:58:05 compute-1 systemd[1]: Started libpod-conmon-90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5.scope.
Jan 26 14:58:05 compute-1 podman[198907]: 2026-01-26 14:58:05.506777444 +0000 UTC m=+0.445332369 container exec 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 14:58:05 compute-1 podman[198926]: 2026-01-26 14:58:05.654444322 +0000 UTC m=+0.133982277 container exec_died 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git)
Jan 26 14:58:06 compute-1 podman[198907]: 2026-01-26 14:58:06.084096585 +0000 UTC m=+1.022651500 container exec_died 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter)
Jan 26 14:58:06 compute-1 systemd[1]: libpod-conmon-90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5.scope: Deactivated successfully.
Jan 26 14:58:06 compute-1 sudo[198904]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:06 compute-1 podman[198941]: 2026-01-26 14:58:06.62691994 +0000 UTC m=+0.502910172 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 14:58:06 compute-1 podman[198940]: 2026-01-26 14:58:06.676092054 +0000 UTC m=+0.560327340 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 14:58:07 compute-1 sudo[199134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyrjxgcjacyvymlqbnlvmkjsximvjord ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439486.7455006-1677-267027546504885/AnsiballZ_file.py'
Jan 26 14:58:07 compute-1 sudo[199134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:07 compute-1 python3.9[199136]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:07 compute-1 sudo[199134]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:07 compute-1 sudo[199286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pujezpmrizllpfxqdetitczgjvcpuekw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439487.5663583-1688-172894982012308/AnsiballZ_file.py'
Jan 26 14:58:07 compute-1 sudo[199286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:08 compute-1 python3.9[199288]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:08 compute-1 sudo[199286]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:08 compute-1 sudo[199438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arlligkzfttmadpazshfvtuvcpbigtej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439488.2683458-1705-45804971385015/AnsiballZ_stat.py'
Jan 26 14:58:08 compute-1 sudo[199438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:08 compute-1 python3.9[199440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:58:08 compute-1 sudo[199438]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:09 compute-1 sudo[199561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjgllxiyhcbautygflzclulncedcwchh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439488.2683458-1705-45804971385015/AnsiballZ_copy.py'
Jan 26 14:58:09 compute-1 sudo[199561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:09 compute-1 python3.9[199563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769439488.2683458-1705-45804971385015/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:09 compute-1 sudo[199561]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:09 compute-1 sshd-session[198428]: Invalid user tmax from 185.246.128.170 port 36127
Jan 26 14:58:10 compute-1 sudo[199713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erswporsxqtnmwphbkwodfzhfmromosz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439489.8451395-1736-104707351011970/AnsiballZ_file.py'
Jan 26 14:58:10 compute-1 sudo[199713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:10 compute-1 python3.9[199715]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:10 compute-1 sudo[199713]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:10 compute-1 sudo[199865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnyufcginvmvlacfeyspzesmzigikhei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439490.586858-1752-162525275911576/AnsiballZ_stat.py'
Jan 26 14:58:10 compute-1 sudo[199865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:11 compute-1 python3.9[199867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:58:11 compute-1 sudo[199865]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:11 compute-1 sshd-session[198428]: Disconnecting invalid user tmax 185.246.128.170 port 36127: Change of username or service not allowed: (tmax,ssh-connection) -> (user01,ssh-connection) [preauth]
Jan 26 14:58:11 compute-1 sudo[199943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udwnnzhbggqfdothwoqjshrogejjkabo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439490.586858-1752-162525275911576/AnsiballZ_file.py'
Jan 26 14:58:11 compute-1 sudo[199943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:11 compute-1 python3.9[199945]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:11 compute-1 sudo[199943]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:12 compute-1 sudo[200096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gryayaxlgujnistlmksowaatsbrybchf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439492.0962145-1776-268680322696228/AnsiballZ_stat.py'
Jan 26 14:58:12 compute-1 sudo[200096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:12 compute-1 python3.9[200098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:58:12 compute-1 sudo[200096]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:12 compute-1 sudo[200175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbthmulxiskaugfcqjbgvriynfcvxstj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439492.0962145-1776-268680322696228/AnsiballZ_file.py'
Jan 26 14:58:12 compute-1 sudo[200175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:13 compute-1 python3.9[200177]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6leuu01a recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:13 compute-1 sudo[200175]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:13 compute-1 sudo[200327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqjezfpbuctsthdcmxypcbmfevvyzknw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439493.4372153-1800-206522193231730/AnsiballZ_stat.py'
Jan 26 14:58:13 compute-1 sudo[200327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:14 compute-1 python3.9[200329]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:58:14 compute-1 sudo[200327]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:14 compute-1 sudo[200405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nivfyjprlylkkabjamyjnontmjxbqjdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439493.4372153-1800-206522193231730/AnsiballZ_file.py'
Jan 26 14:58:14 compute-1 sudo[200405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:14 compute-1 python3.9[200407]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:14 compute-1 sudo[200405]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:15 compute-1 sudo[200557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqrrpynkoovwjydsgwwyqfmmxmtfwiyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439494.9555485-1826-22065271725420/AnsiballZ_command.py'
Jan 26 14:58:15 compute-1 sudo[200557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:15 compute-1 python3.9[200559]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:58:15 compute-1 sudo[200557]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:16 compute-1 sudo[200710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvagudqncfdpacyyoibrupaiwgxpybjf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769439495.760876-1842-194358507194148/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 14:58:16 compute-1 sudo[200710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:16 compute-1 sshd-session[199993]: Invalid user user01 from 185.246.128.170 port 55417
Jan 26 14:58:16 compute-1 python3[200712]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 14:58:16 compute-1 sudo[200710]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:16 compute-1 sshd-session[199993]: Disconnecting invalid user user01 185.246.128.170 port 55417: Change of username or service not allowed: (user01,ssh-connection) -> (root2,ssh-connection) [preauth]
Jan 26 14:58:16 compute-1 sudo[200862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqfdqcobpqzchdbvimwxokqjagwnhvmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439496.5965014-1858-37822482955939/AnsiballZ_stat.py'
Jan 26 14:58:16 compute-1 sudo[200862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:17 compute-1 python3.9[200864]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:58:17 compute-1 sudo[200862]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:17 compute-1 sudo[200940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciohbopeogsvgllysmlrneixpsahlkao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439496.5965014-1858-37822482955939/AnsiballZ_file.py'
Jan 26 14:58:17 compute-1 sudo[200940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:17 compute-1 python3.9[200942]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:17 compute-1 sudo[200940]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:18 compute-1 sudo[201094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrraiyrlyyovplbwerwmysegbqwzvtyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439497.9060237-1882-196181165434134/AnsiballZ_stat.py'
Jan 26 14:58:18 compute-1 sudo[201094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:18 compute-1 python3.9[201096]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:58:18 compute-1 sudo[201094]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:18 compute-1 sudo[201172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvhleihtzonmwvxqnzqiikdppduunbdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439497.9060237-1882-196181165434134/AnsiballZ_file.py'
Jan 26 14:58:18 compute-1 sudo[201172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:18 compute-1 sshd-session[200943]: Invalid user root2 from 185.246.128.170 port 20905
Jan 26 14:58:18 compute-1 python3.9[201174]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:18 compute-1 sudo[201172]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:18 compute-1 sshd-session[200943]: Disconnecting invalid user root2 185.246.128.170 port 20905: Change of username or service not allowed: (root2,ssh-connection) -> (openvswitch,ssh-connection) [preauth]
Jan 26 14:58:19 compute-1 sudo[201324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epqatijyvqxjnmduwhojqqcowmlpnsto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439499.191175-1906-210719119932120/AnsiballZ_stat.py'
Jan 26 14:58:19 compute-1 sudo[201324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:19 compute-1 python3.9[201326]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:58:19 compute-1 sudo[201324]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:19 compute-1 sudo[201402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gddbsfguocguesehszpfcodwepojrtbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439499.191175-1906-210719119932120/AnsiballZ_file.py'
Jan 26 14:58:19 compute-1 sudo[201402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:20 compute-1 python3.9[201404]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:20 compute-1 sudo[201402]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:20 compute-1 sudo[201554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egacwsqbxoibwjjrgejzlrdcnnqsafqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439500.3685558-1930-197235156254388/AnsiballZ_stat.py'
Jan 26 14:58:20 compute-1 sudo[201554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:20 compute-1 python3.9[201556]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:58:20 compute-1 sudo[201554]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:21 compute-1 sudo[201632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utldnrnwzlpdpiucefifilqclmoduymk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439500.3685558-1930-197235156254388/AnsiballZ_file.py'
Jan 26 14:58:21 compute-1 sudo[201632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:21 compute-1 python3.9[201634]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:21 compute-1 sudo[201632]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:22 compute-1 sudo[201784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjebizbqnueurfvoesgbfruhsqlknebt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439501.6669064-1955-36970819126448/AnsiballZ_stat.py'
Jan 26 14:58:22 compute-1 sudo[201784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:22 compute-1 python3.9[201786]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 14:58:22 compute-1 sudo[201784]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:22 compute-1 sudo[201909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcotpudpsitsmhewcxushdmwnmyjpkfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439501.6669064-1955-36970819126448/AnsiballZ_copy.py'
Jan 26 14:58:22 compute-1 sudo[201909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:22 compute-1 python3.9[201911]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769439501.6669064-1955-36970819126448/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:22 compute-1 sudo[201909]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:23 compute-1 sudo[202061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wixypfslvjlnurctnzkzqntnyycboydc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439503.1121702-1984-244920336606160/AnsiballZ_file.py'
Jan 26 14:58:23 compute-1 sudo[202061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:23 compute-1 python3.9[202063]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:23 compute-1 sudo[202061]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:24 compute-1 sudo[202213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfybpymqmovkeafbllzetevobxxkzzby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439503.8038046-2000-183990107611640/AnsiballZ_command.py'
Jan 26 14:58:24 compute-1 sudo[202213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:24 compute-1 python3.9[202216]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:58:24 compute-1 sudo[202213]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:24 compute-1 sudo[202369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvoacmwvnvxapztjiimswnxumgvwlijq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439504.528623-2017-246496171576856/AnsiballZ_blockinfile.py'
Jan 26 14:58:24 compute-1 sudo[202369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:25 compute-1 python3.9[202371]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:25 compute-1 sudo[202369]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:25 compute-1 sudo[202536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uskrkeewggcmvjnszstbspvuunomlgtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439505.4208086-2035-16393065774570/AnsiballZ_command.py'
Jan 26 14:58:25 compute-1 sudo[202536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:25 compute-1 podman[202496]: 2026-01-26 14:58:25.728246623 +0000 UTC m=+0.057781070 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 14:58:25 compute-1 python3.9[202546]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:58:25 compute-1 sudo[202536]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:26 compute-1 sudo[202697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzwjcjipqweaorcchufplnvxzkwugund ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439506.1263332-2050-237400484199692/AnsiballZ_stat.py'
Jan 26 14:58:26 compute-1 sudo[202697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:26 compute-1 python3.9[202699]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 14:58:26 compute-1 sudo[202697]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:27 compute-1 podman[202825]: 2026-01-26 14:58:27.19769404 +0000 UTC m=+0.063151825 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 14:58:27 compute-1 sudo[202875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjmfhnkggrglaauslxqxghlvtoakghsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439506.9014683-2066-207647817114820/AnsiballZ_command.py'
Jan 26 14:58:27 compute-1 sudo[202875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:27 compute-1 python3.9[202877]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 14:58:27 compute-1 sudo[202875]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:27 compute-1 sshd-session[202215]: Disconnecting authenticating user openvswitch 185.246.128.170 port 27833: Change of username or service not allowed: (openvswitch,ssh-connection) -> (sapadm,ssh-connection) [preauth]
Jan 26 14:58:27 compute-1 sudo[203031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuuzxqvljuihghmthwnehrybsiiusfxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769439507.649312-2082-224959258766666/AnsiballZ_file.py'
Jan 26 14:58:27 compute-1 sudo[203031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 14:58:28 compute-1 python3.9[203033]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 14:58:28 compute-1 sudo[203031]: pam_unix(sudo:session): session closed for user root
Jan 26 14:58:28 compute-1 sshd-session[183733]: Connection closed by 192.168.122.30 port 59636
Jan 26 14:58:28 compute-1 sshd-session[183730]: pam_unix(sshd:session): session closed for user zuul
Jan 26 14:58:28 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Jan 26 14:58:28 compute-1 systemd[1]: session-27.scope: Consumed 1min 22.185s CPU time.
Jan 26 14:58:28 compute-1 systemd-logind[795]: Session 27 logged out. Waiting for processes to exit.
Jan 26 14:58:28 compute-1 systemd-logind[795]: Removed session 27.
Jan 26 14:58:29 compute-1 sshd-session[203034]: Invalid user sapadm from 185.246.128.170 port 37383
Jan 26 14:58:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:58:29.011 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:58:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:58:29.011 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:58:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:58:29.011 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:58:29 compute-1 sshd-session[203034]: Disconnecting invalid user sapadm 185.246.128.170 port 37383: Change of username or service not allowed: (sapadm,ssh-connection) -> (smart,ssh-connection) [preauth]
Jan 26 14:58:33 compute-1 sshd-session[203061]: Invalid user smart from 185.246.128.170 port 50121
Jan 26 14:58:35 compute-1 sshd-session[203061]: Disconnecting invalid user smart 185.246.128.170 port 50121: Change of username or service not allowed: (smart,ssh-connection) -> (cisco,ssh-connection) [preauth]
Jan 26 14:58:35 compute-1 podman[192725]: time="2026-01-26T14:58:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 14:58:35 compute-1 podman[192725]: @ - - [26/Jan/2026:14:58:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 14:58:35 compute-1 podman[192725]: @ - - [26/Jan/2026:14:58:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2143 "" "Go-http-client/1.1"
Jan 26 14:58:36 compute-1 podman[203064]: 2026-01-26 14:58:36.884426559 +0000 UTC m=+0.053133714 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 14:58:36 compute-1 podman[203063]: 2026-01-26 14:58:36.897603827 +0000 UTC m=+0.075784038 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 26 14:58:49 compute-1 openstack_network_exporter[195610]: ERROR   14:58:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 14:58:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 14:58:49 compute-1 openstack_network_exporter[195610]: ERROR   14:58:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 14:58:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 14:58:55 compute-1 podman[203115]: 2026-01-26 14:58:55.875023656 +0000 UTC m=+0.057479781 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Jan 26 14:58:57 compute-1 sshd-session[203108]: Invalid user cisco from 185.246.128.170 port 31365
Jan 26 14:58:57 compute-1 podman[203136]: 2026-01-26 14:58:57.884641845 +0000 UTC m=+0.070139684 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.126 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.127 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.127 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.127 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.128 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.128 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.128 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.128 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.128 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.757 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.758 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.758 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.758 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.926 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.928 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.948 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.949 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6100MB free_disk=73.1856689453125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.949 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:58:58 compute-1 nova_compute[183403]: 2026-01-26 14:58:58.950 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:59:02 compute-1 nova_compute[183403]: 2026-01-26 14:59:02.449 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 14:59:02 compute-1 nova_compute[183403]: 2026-01-26 14:59:02.449 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:58:58 up 54 min,  0 user,  load average: 0.34, 0.67, 0.68\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 14:59:02 compute-1 nova_compute[183403]: 2026-01-26 14:59:02.467 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 14:59:03 compute-1 nova_compute[183403]: 2026-01-26 14:59:03.017 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 14:59:03 compute-1 nova_compute[183403]: 2026-01-26 14:59:03.614 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 14:59:03 compute-1 nova_compute[183403]: 2026-01-26 14:59:03.615 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.665s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:59:05 compute-1 podman[192725]: time="2026-01-26T14:59:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 14:59:05 compute-1 podman[192725]: @ - - [26/Jan/2026:14:59:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 14:59:05 compute-1 podman[192725]: @ - - [26/Jan/2026:14:59:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2149 "" "Go-http-client/1.1"
Jan 26 14:59:07 compute-1 podman[203161]: 2026-01-26 14:59:07.899080651 +0000 UTC m=+0.076508970 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Jan 26 14:59:07 compute-1 podman[203160]: 2026-01-26 14:59:07.906710406 +0000 UTC m=+0.089362965 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 14:59:08 compute-1 sshd-session[203108]: Disconnecting invalid user cisco 185.246.128.170 port 31365: Change of username or service not allowed: (cisco,ssh-connection) -> (qaz,ssh-connection) [preauth]
Jan 26 14:59:16 compute-1 sshd-session[203204]: Invalid user qaz from 185.246.128.170 port 54351
Jan 26 14:59:16 compute-1 sshd-session[203204]: Disconnecting invalid user qaz 185.246.128.170 port 54351: Change of username or service not allowed: (qaz,ssh-connection) -> (webapp,ssh-connection) [preauth]
Jan 26 14:59:19 compute-1 openstack_network_exporter[195610]: ERROR   14:59:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 14:59:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 14:59:19 compute-1 openstack_network_exporter[195610]: ERROR   14:59:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 14:59:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 14:59:26 compute-1 podman[203209]: 2026-01-26 14:59:26.904026079 +0000 UTC m=+0.076582483 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc.)
Jan 26 14:59:27 compute-1 sshd-session[203206]: Invalid user webapp from 185.246.128.170 port 33411
Jan 26 14:59:28 compute-1 sshd-session[203206]: Disconnecting invalid user webapp 185.246.128.170 port 33411: Change of username or service not allowed: (webapp,ssh-connection) -> (user1,ssh-connection) [preauth]
Jan 26 14:59:28 compute-1 podman[203230]: 2026-01-26 14:59:28.861233507 +0000 UTC m=+0.047819761 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 14:59:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:59:29.012 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 14:59:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:59:29.012 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 14:59:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 14:59:29.012 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 14:59:31 compute-1 sshd-session[203255]: Invalid user user1 from 185.246.128.170 port 15354
Jan 26 14:59:35 compute-1 podman[192725]: time="2026-01-26T14:59:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 14:59:35 compute-1 podman[192725]: @ - - [26/Jan/2026:14:59:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 14:59:35 compute-1 podman[192725]: @ - - [26/Jan/2026:14:59:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 26 14:59:38 compute-1 podman[203259]: 2026-01-26 14:59:38.879264764 +0000 UTC m=+0.048551772 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 14:59:38 compute-1 podman[203258]: 2026-01-26 14:59:38.902029093 +0000 UTC m=+0.083474527 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 14:59:40 compute-1 sshd-session[203255]: Disconnecting invalid user user1 185.246.128.170 port 15354: Change of username or service not allowed: (user1,ssh-connection) -> (prod,ssh-connection) [preauth]
Jan 26 14:59:47 compute-1 sshd-session[203300]: Invalid user prod from 185.246.128.170 port 56823
Jan 26 14:59:47 compute-1 sshd-session[203300]: Disconnecting invalid user prod 185.246.128.170 port 56823: Change of username or service not allowed: (prod,ssh-connection) -> (sftp_user,ssh-connection) [preauth]
Jan 26 14:59:49 compute-1 openstack_network_exporter[195610]: ERROR   14:59:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 14:59:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 14:59:49 compute-1 openstack_network_exporter[195610]: ERROR   14:59:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 14:59:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 14:59:53 compute-1 sshd-session[203302]: Invalid user sftp_user from 185.246.128.170 port 6644
Jan 26 14:59:53 compute-1 sshd-session[203302]: Disconnecting invalid user sftp_user 185.246.128.170 port 6644: Change of username or service not allowed: (sftp_user,ssh-connection) -> (mc1,ssh-connection) [preauth]
Jan 26 14:59:57 compute-1 podman[203304]: 2026-01-26 14:59:57.900517759 +0000 UTC m=+0.065961148 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Jan 26 14:59:59 compute-1 podman[203326]: 2026-01-26 14:59:59.881364461 +0000 UTC m=+0.061704114 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.060 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.060 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.574 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.574 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.574 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.574 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.574 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.575 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:00:01 compute-1 nova_compute[183403]: 2026-01-26 15:00:01.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.087 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.087 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.088 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.088 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.275 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.276 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.293 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.294 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6179MB free_disk=73.18566513061523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.294 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:00:02 compute-1 nova_compute[183403]: 2026-01-26 15:00:02.295 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:00:03 compute-1 nova_compute[183403]: 2026-01-26 15:00:03.350 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:00:03 compute-1 nova_compute[183403]: 2026-01-26 15:00:03.351 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:00:02 up 55 min,  0 user,  load average: 0.12, 0.55, 0.63\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:00:03 compute-1 nova_compute[183403]: 2026-01-26 15:00:03.369 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:00:04 compute-1 nova_compute[183403]: 2026-01-26 15:00:04.407 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:00:04 compute-1 nova_compute[183403]: 2026-01-26 15:00:04.919 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:00:04 compute-1 nova_compute[183403]: 2026-01-26 15:00:04.919 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.624s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:00:05 compute-1 podman[192725]: time="2026-01-26T15:00:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:00:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:00:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:00:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:00:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Jan 26 15:00:09 compute-1 podman[203355]: 2026-01-26 15:00:09.88660239 +0000 UTC m=+0.062682119 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 26 15:00:09 compute-1 podman[203354]: 2026-01-26 15:00:09.902195198 +0000 UTC m=+0.081062985 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0)
Jan 26 15:00:13 compute-1 sshd-session[203351]: Invalid user mc1 from 185.246.128.170 port 34944
Jan 26 15:00:13 compute-1 sshd-session[203351]: Disconnecting invalid user mc1 185.246.128.170 port 34944: Change of username or service not allowed: (mc1,ssh-connection) -> (devops,ssh-connection) [preauth]
Jan 26 15:00:20 compute-1 sshd-session[203401]: Invalid user devops from 185.246.128.170 port 46000
Jan 26 15:00:22 compute-1 sshd-session[203401]: Disconnecting invalid user devops 185.246.128.170 port 46000: Change of username or service not allowed: (devops,ssh-connection) -> (cloudera,ssh-connection) [preauth]
Jan 26 15:00:28 compute-1 podman[203405]: 2026-01-26 15:00:28.877614261 +0000 UTC m=+0.062280289 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=)
Jan 26 15:00:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:29.013 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:00:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:29.014 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:00:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:29.014 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:00:30 compute-1 podman[203428]: 2026-01-26 15:00:30.880172681 +0000 UTC m=+0.052968772 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:00:32 compute-1 sshd-session[203403]: Invalid user cloudera from 185.246.128.170 port 53515
Jan 26 15:00:33 compute-1 sshd-session[203403]: Disconnecting invalid user cloudera 185.246.128.170 port 53515: Change of username or service not allowed: (cloudera,ssh-connection) -> (syncthing,ssh-connection) [preauth]
Jan 26 15:00:35 compute-1 podman[192725]: time="2026-01-26T15:00:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:00:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:00:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:00:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:00:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2149 "" "Go-http-client/1.1"
Jan 26 15:00:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:39.149 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:00:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:39.149 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:00:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:39.151 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:00:40 compute-1 podman[203457]: 2026-01-26 15:00:40.873952211 +0000 UTC m=+0.049076970 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 15:00:40 compute-1 podman[203456]: 2026-01-26 15:00:40.949793698 +0000 UTC m=+0.122100198 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 15:00:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:43.951 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:cd:e9 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-626086fb-353b-4e7a-aef0-b1d433526b43', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-626086fb-353b-4e7a-aef0-b1d433526b43', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '179f3c996d8f4e7ea1b0aca3ec76f02e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=359f7074-478a-4e31-844d-30d210b62883, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=da7e2d26-6906-48f6-9469-9e8ba6fe597d) old=Port_Binding(mac=['fa:16:3e:9d:cd:e9'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-626086fb-353b-4e7a-aef0-b1d433526b43', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-626086fb-353b-4e7a-aef0-b1d433526b43', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '179f3c996d8f4e7ea1b0aca3ec76f02e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:00:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:43.952 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port da7e2d26-6906-48f6-9469-9e8ba6fe597d in datapath 626086fb-353b-4e7a-aef0-b1d433526b43 updated
Jan 26 15:00:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:43.953 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 626086fb-353b-4e7a-aef0-b1d433526b43, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:00:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:43.953 104930 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpomaquc_6/privsep.sock']
Jan 26 15:00:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:44.663 104930 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 15:00:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:44.664 104930 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpomaquc_6/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Jan 26 15:00:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:44.527 203506 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 15:00:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:44.530 203506 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 15:00:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:44.531 203506 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 26 15:00:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:44.532 203506 INFO oslo.privsep.daemon [-] privsep daemon running as pid 203506
Jan 26 15:00:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:44.665 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b831a9-49f7-4994-8a9a-4747eda5494e]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:00:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:45.126 203506 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:00:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:45.126 203506 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:00:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:45.126 203506 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:00:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:45.649 203506 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 26 15:00:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:45.654 203506 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 26 15:00:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:00:45.691 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3b87056e-4072-49ce-be9a-1b8570d67f66]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:00:45 compute-1 sshd-session[203455]: Invalid user syncthing from 185.246.128.170 port 57130
Jan 26 15:00:46 compute-1 sshd-session[203455]: Disconnecting invalid user syncthing 185.246.128.170 port 57130: Change of username or service not allowed: (syncthing,ssh-connection) -> (yesenia,ssh-connection) [preauth]
Jan 26 15:00:49 compute-1 openstack_network_exporter[195610]: ERROR   15:00:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:00:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:00:49 compute-1 openstack_network_exporter[195610]: ERROR   15:00:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:00:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:00:51 compute-1 nova_compute[183403]: 2026-01-26 15:00:51.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:51 compute-1 nova_compute[183403]: 2026-01-26 15:00:51.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:00:52 compute-1 nova_compute[183403]: 2026-01-26 15:00:52.107 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:00:52 compute-1 nova_compute[183403]: 2026-01-26 15:00:52.108 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:52 compute-1 nova_compute[183403]: 2026-01-26 15:00:52.109 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:00:52 compute-1 nova_compute[183403]: 2026-01-26 15:00:52.616 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:54 compute-1 nova_compute[183403]: 2026-01-26 15:00:54.123 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:54 compute-1 nova_compute[183403]: 2026-01-26 15:00:54.124 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:54 compute-1 nova_compute[183403]: 2026-01-26 15:00:54.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:54 compute-1 nova_compute[183403]: 2026-01-26 15:00:54.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.098 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.098 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.099 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.099 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.304 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.306 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.324 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.325 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6075MB free_disk=73.18566513061523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.325 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:00:55 compute-1 nova_compute[183403]: 2026-01-26 15:00:55.326 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:00:56 compute-1 nova_compute[183403]: 2026-01-26 15:00:56.385 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:00:56 compute-1 nova_compute[183403]: 2026-01-26 15:00:56.385 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:00:55 up 56 min,  0 user,  load average: 0.05, 0.45, 0.60\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:00:56 compute-1 nova_compute[183403]: 2026-01-26 15:00:56.405 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:00:56 compute-1 nova_compute[183403]: 2026-01-26 15:00:56.916 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:00:57 compute-1 nova_compute[183403]: 2026-01-26 15:00:57.434 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:00:57 compute-1 nova_compute[183403]: 2026-01-26 15:00:57.435 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:00:58 compute-1 nova_compute[183403]: 2026-01-26 15:00:58.434 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:58 compute-1 nova_compute[183403]: 2026-01-26 15:00:58.435 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:58 compute-1 nova_compute[183403]: 2026-01-26 15:00:58.435 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:00:58 compute-1 nova_compute[183403]: 2026-01-26 15:00:58.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:58 compute-1 nova_compute[183403]: 2026-01-26 15:00:58.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:00:59 compute-1 podman[203513]: 2026-01-26 15:00:59.886286559 +0000 UTC m=+0.066246424 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Jan 26 15:01:01 compute-1 CROND[203548]: (root) CMD (run-parts /etc/cron.hourly)
Jan 26 15:01:01 compute-1 run-parts[203555]: (/etc/cron.hourly) starting 0anacron
Jan 26 15:01:01 compute-1 anacron[203574]: Anacron started on 2026-01-26
Jan 26 15:01:01 compute-1 anacron[203574]: Will run job `cron.daily' in 36 min.
Jan 26 15:01:01 compute-1 anacron[203574]: Will run job `cron.weekly' in 56 min.
Jan 26 15:01:01 compute-1 anacron[203574]: Will run job `cron.monthly' in 76 min.
Jan 26 15:01:01 compute-1 anacron[203574]: Jobs will be executed sequentially
Jan 26 15:01:01 compute-1 podman[203538]: 2026-01-26 15:01:01.887026355 +0000 UTC m=+0.063448767 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:01:01 compute-1 run-parts[203576]: (/etc/cron.hourly) finished 0anacron
Jan 26 15:01:01 compute-1 CROND[203547]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 26 15:01:05 compute-1 podman[192725]: time="2026-01-26T15:01:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:01:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:01:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:01:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:01:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2151 "" "Go-http-client/1.1"
Jan 26 15:01:07 compute-1 sshd-session[203512]: Invalid user yesenia from 185.246.128.170 port 27099
Jan 26 15:01:11 compute-1 sshd-session[203512]: Disconnecting invalid user yesenia 185.246.128.170 port 27099: Change of username or service not allowed: (yesenia,ssh-connection) -> (tester,ssh-connection) [preauth]
Jan 26 15:01:11 compute-1 podman[203578]: 2026-01-26 15:01:11.888339431 +0000 UTC m=+0.057030070 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:01:11 compute-1 podman[203577]: 2026-01-26 15:01:11.926770017 +0000 UTC m=+0.097191111 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
config_id=ovn_controller, io.buildah.version=1.41.4)
Jan 26 15:01:19 compute-1 openstack_network_exporter[195610]: ERROR   15:01:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:01:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:01:19 compute-1 openstack_network_exporter[195610]: ERROR   15:01:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:01:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:01:20 compute-1 sshd-session[203622]: Invalid user tester from 185.246.128.170 port 13507
Jan 26 15:01:22 compute-1 sshd-session[203622]: Disconnecting invalid user tester 185.246.128.170 port 13507: Change of username or service not allowed: (tester,ssh-connection) -> (wang,ssh-connection) [preauth]
Jan 26 15:01:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:01:29.016 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:01:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:01:29.016 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:01:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:01:29.016 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:01:30 compute-1 podman[203627]: 2026-01-26 15:01:30.886495804 +0000 UTC m=+0.065696026 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Jan 26 15:01:31 compute-1 sshd-session[203625]: Invalid user wang from 185.246.128.170 port 40759
Jan 26 15:01:32 compute-1 podman[203649]: 2026-01-26 15:01:32.006605292 +0000 UTC m=+0.059906379 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:01:33 compute-1 sshd-session[203625]: Disconnecting invalid user wang 185.246.128.170 port 40759: Change of username or service not allowed: (wang,ssh-connection) -> (zabbix,ssh-connection) [preauth]
Jan 26 15:01:35 compute-1 podman[192725]: time="2026-01-26T15:01:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:01:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:01:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:01:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:01:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2149 "" "Go-http-client/1.1"
Jan 26 15:01:42 compute-1 podman[203676]: 2026-01-26 15:01:42.884154577 +0000 UTC m=+0.063105629 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 15:01:42 compute-1 podman[203675]: 2026-01-26 15:01:42.915079949 +0000 UTC m=+0.094322440 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4)
Jan 26 15:01:43 compute-1 sshd-session[203673]: Invalid user zabbix from 185.246.128.170 port 1325
Jan 26 15:01:48 compute-1 sshd-session[203673]: Disconnecting invalid user zabbix 185.246.128.170 port 1325: Change of username or service not allowed: (zabbix,ssh-connection) -> (admin,ssh-connection) [preauth]
Jan 26 15:01:49 compute-1 openstack_network_exporter[195610]: ERROR   15:01:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:01:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:01:49 compute-1 openstack_network_exporter[195610]: ERROR   15:01:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:01:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:01:52 compute-1 nova_compute[183403]: 2026-01-26 15:01:52.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:01:53 compute-1 nova_compute[183403]: 2026-01-26 15:01:53.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:01:54 compute-1 nova_compute[183403]: 2026-01-26 15:01:54.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.095 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.095 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.095 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.095 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.229 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.230 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.264 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.264 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6090MB free_disk=73.18535995483398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.265 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:01:55 compute-1 nova_compute[183403]: 2026-01-26 15:01:55.265 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:01:56 compute-1 nova_compute[183403]: 2026-01-26 15:01:56.341 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:01:56 compute-1 nova_compute[183403]: 2026-01-26 15:01:56.341 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:01:55 up 57 min,  0 user,  load average: 0.02, 0.37, 0.56\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:01:56 compute-1 nova_compute[183403]: 2026-01-26 15:01:56.392 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:01:56 compute-1 nova_compute[183403]: 2026-01-26 15:01:56.430 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:01:56 compute-1 nova_compute[183403]: 2026-01-26 15:01:56.430 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:01:56 compute-1 nova_compute[183403]: 2026-01-26 15:01:56.443 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:01:56 compute-1 nova_compute[183403]: 2026-01-26 15:01:56.461 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:01:56 compute-1 nova_compute[183403]: 2026-01-26 15:01:56.487 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:01:56 compute-1 nova_compute[183403]: 2026-01-26 15:01:56.997 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:01:57 compute-1 nova_compute[183403]: 2026-01-26 15:01:57.504 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:01:57 compute-1 nova_compute[183403]: 2026-01-26 15:01:57.505 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.240s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:01:57 compute-1 sshd-session[203720]: Invalid user admin from 185.246.128.170 port 14888
Jan 26 15:01:58 compute-1 nova_compute[183403]: 2026-01-26 15:01:58.501 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:01:59 compute-1 nova_compute[183403]: 2026-01-26 15:01:59.070 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:01:59 compute-1 nova_compute[183403]: 2026-01-26 15:01:59.070 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:01:59 compute-1 nova_compute[183403]: 2026-01-26 15:01:59.070 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:01:59 compute-1 nova_compute[183403]: 2026-01-26 15:01:59.141 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:01:59 compute-1 nova_compute[183403]: 2026-01-26 15:01:59.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:02:00 compute-1 nova_compute[183403]: 2026-01-26 15:02:00.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:02:01 compute-1 podman[203723]: 2026-01-26 15:02:01.920206185 +0000 UTC m=+0.092562133 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public)
Jan 26 15:02:02 compute-1 podman[203744]: 2026-01-26 15:02:02.878164428 +0000 UTC m=+0.056937105 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:02:05 compute-1 podman[192725]: time="2026-01-26T15:02:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:02:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:02:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:02:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:02:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2150 "" "Go-http-client/1.1"
Jan 26 15:02:08 compute-1 sshd-session[203720]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 14888 ssh2 [preauth]
Jan 26 15:02:08 compute-1 sshd-session[203720]: Disconnecting invalid user admin 185.246.128.170 port 14888: Too many authentication failures [preauth]
Jan 26 15:02:12 compute-1 sshd-session[203769]: Invalid user admin from 185.246.128.170 port 57504
Jan 26 15:02:13 compute-1 podman[203772]: 2026-01-26 15:02:13.898394392 +0000 UTC m=+0.070778357 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 15:02:14 compute-1 podman[203771]: 2026-01-26 15:02:14.02767342 +0000 UTC m=+0.193657337 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 26 15:02:19 compute-1 sshd-session[203769]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 57504 ssh2 [preauth]
Jan 26 15:02:19 compute-1 sshd-session[203769]: Disconnecting invalid user admin 185.246.128.170 port 57504: Too many authentication failures [preauth]
Jan 26 15:02:19 compute-1 openstack_network_exporter[195610]: ERROR   15:02:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:02:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:02:19 compute-1 openstack_network_exporter[195610]: ERROR   15:02:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:02:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:02:23 compute-1 sshd-session[203813]: Invalid user admin from 185.246.128.170 port 15410
Jan 26 15:02:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:02:29.017 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:02:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:02:29.018 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:02:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:02:29.018 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:02:30 compute-1 sshd-session[203813]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 15410 ssh2 [preauth]
Jan 26 15:02:30 compute-1 sshd-session[203813]: Disconnecting invalid user admin 185.246.128.170 port 15410: Too many authentication failures [preauth]
Jan 26 15:02:32 compute-1 podman[203816]: 2026-01-26 15:02:32.888933126 +0000 UTC m=+0.061880001 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Jan 26 15:02:32 compute-1 podman[203837]: 2026-01-26 15:02:32.966815147 +0000 UTC m=+0.054858337 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:02:35 compute-1 sshd-session[203862]: Invalid user admin from 185.246.128.170 port 7527
Jan 26 15:02:35 compute-1 podman[192725]: time="2026-01-26T15:02:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:02:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:02:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:02:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:02:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Jan 26 15:02:44 compute-1 podman[203865]: 2026-01-26 15:02:44.884925859 +0000 UTC m=+0.058084432 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 15:02:44 compute-1 podman[203864]: 2026-01-26 15:02:44.916138156 +0000 UTC m=+0.096514337 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 15:02:47 compute-1 sshd-session[203862]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 7527 ssh2 [preauth]
Jan 26 15:02:47 compute-1 sshd-session[203862]: Disconnecting invalid user admin 185.246.128.170 port 7527: Too many authentication failures [preauth]
Jan 26 15:02:49 compute-1 openstack_network_exporter[195610]: ERROR   15:02:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:02:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:02:49 compute-1 openstack_network_exporter[195610]: ERROR   15:02:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:02:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:02:53 compute-1 nova_compute[183403]: 2026-01-26 15:02:53.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:02:54 compute-1 nova_compute[183403]: 2026-01-26 15:02:54.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:02:55 compute-1 nova_compute[183403]: 2026-01-26 15:02:55.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:02:55 compute-1 nova_compute[183403]: 2026-01-26 15:02:55.576 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:02:55 compute-1 nova_compute[183403]: 2026-01-26 15:02:55.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.090 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.237 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.238 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.267 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.268 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6081MB free_disk=73.18490982055664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.268 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:02:56 compute-1 nova_compute[183403]: 2026-01-26 15:02:56.268 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:02:57 compute-1 nova_compute[183403]: 2026-01-26 15:02:57.324 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:02:57 compute-1 nova_compute[183403]: 2026-01-26 15:02:57.325 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:02:56 up 58 min,  0 user,  load average: 0.00, 0.30, 0.52\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:02:57 compute-1 nova_compute[183403]: 2026-01-26 15:02:57.348 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:02:57 compute-1 nova_compute[183403]: 2026-01-26 15:02:57.855 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:02:58 compute-1 nova_compute[183403]: 2026-01-26 15:02:58.367 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:02:58 compute-1 nova_compute[183403]: 2026-01-26 15:02:58.367 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:02:59 compute-1 nova_compute[183403]: 2026-01-26 15:02:59.369 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:02:59 compute-1 nova_compute[183403]: 2026-01-26 15:02:59.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:02:59 compute-1 nova_compute[183403]: 2026-01-26 15:02:59.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:03:01 compute-1 nova_compute[183403]: 2026-01-26 15:03:01.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:03:03 compute-1 podman[203911]: 2026-01-26 15:03:03.872622796 +0000 UTC m=+0.054910679 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:03:03 compute-1 podman[203912]: 2026-01-26 15:03:03.875336009 +0000 UTC m=+0.054039639 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Jan 26 15:03:05 compute-1 podman[192725]: time="2026-01-26T15:03:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:03:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:03:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:03:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:03:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 26 15:03:07 compute-1 sshd-session[203909]: Invalid user admin from 185.246.128.170 port 1631
Jan 26 15:03:15 compute-1 podman[203956]: 2026-01-26 15:03:15.903654636 +0000 UTC m=+0.072564328 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS)
Jan 26 15:03:15 compute-1 podman[203955]: 2026-01-26 15:03:15.931283149 +0000 UTC m=+0.109438036 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:03:19 compute-1 openstack_network_exporter[195610]: ERROR   15:03:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:03:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:03:19 compute-1 openstack_network_exporter[195610]: ERROR   15:03:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:03:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:03:21 compute-1 sshd-session[203909]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 1631 ssh2 [preauth]
Jan 26 15:03:21 compute-1 sshd-session[203909]: Disconnecting invalid user admin 185.246.128.170 port 1631: Too many authentication failures [preauth]
Jan 26 15:03:26 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:26.117 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:03:26 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:26.117 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:03:26 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:26.118 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:03:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:29.019 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:03:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:29.020 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:03:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:29.020 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:03:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:30.043 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:43:3c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e53a0f22-1bc2-42af-b8a6-9e3ff9450836', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e53a0f22-1bc2-42af-b8a6-9e3ff9450836', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4bfb8226246a4462b2804cf37114f28b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a2eea81-4d53-45c8-8d70-b5f44c3e1f84, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f142ea1c-b095-4cf2-802c-bc432f1e597e) old=Port_Binding(mac=['fa:16:3e:ca:43:3c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e53a0f22-1bc2-42af-b8a6-9e3ff9450836', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e53a0f22-1bc2-42af-b8a6-9e3ff9450836', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4bfb8226246a4462b2804cf37114f28b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:03:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:30.044 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f142ea1c-b095-4cf2-802c-bc432f1e597e in datapath e53a0f22-1bc2-42af-b8a6-9e3ff9450836 updated
Jan 26 15:03:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:30.044 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e53a0f22-1bc2-42af-b8a6-9e3ff9450836, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:03:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:30.046 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[96635d56-ec2d-484e-a28d-4209ab96a5d4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:03:34 compute-1 podman[204003]: 2026-01-26 15:03:34.883473872 +0000 UTC m=+0.056596240 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:03:34 compute-1 podman[204004]: 2026-01-26 15:03:34.926782856 +0000 UTC m=+0.091014590 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 15:03:35 compute-1 podman[192725]: time="2026-01-26T15:03:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:03:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:03:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:03:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:03:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 26 15:03:41 compute-1 sshd-session[204001]: Invalid user admin from 185.246.128.170 port 51456
Jan 26 15:03:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:43.962 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:85:b2 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-637c4ad6-7314-4839-8fd8-841be66eb54e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-637c4ad6-7314-4839-8fd8-841be66eb54e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77301a970e0e4c3d8d126ff6d547b93c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31d22c9f-7d3b-403f-bdd5-5675521a2c91, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6d3b4beb-f881-4114-97e5-2b03f5be79cc) old=Port_Binding(mac=['fa:16:3e:a8:85:b2'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-637c4ad6-7314-4839-8fd8-841be66eb54e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-637c4ad6-7314-4839-8fd8-841be66eb54e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77301a970e0e4c3d8d126ff6d547b93c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:03:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:43.963 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6d3b4beb-f881-4114-97e5-2b03f5be79cc in datapath 637c4ad6-7314-4839-8fd8-841be66eb54e updated
Jan 26 15:03:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:43.964 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 637c4ad6-7314-4839-8fd8-841be66eb54e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:03:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:03:43.965 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7c5d8e17-06f5-4a75-95b8-5d2e67798f7d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:03:46 compute-1 podman[204049]: 2026-01-26 15:03:46.905364386 +0000 UTC m=+0.084019279 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 15:03:46 compute-1 podman[204048]: 2026-01-26 15:03:46.922251884 +0000 UTC m=+0.100469907 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 15:03:48 compute-1 sshd-session[204001]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 51456 ssh2 [preauth]
Jan 26 15:03:48 compute-1 sshd-session[204001]: Disconnecting invalid user admin 185.246.128.170 port 51456: Too many authentication failures [preauth]
Jan 26 15:03:49 compute-1 openstack_network_exporter[195610]: ERROR   15:03:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:03:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:03:49 compute-1 openstack_network_exporter[195610]: ERROR   15:03:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:03:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:03:53 compute-1 nova_compute[183403]: 2026-01-26 15:03:53.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:03:54 compute-1 nova_compute[183403]: 2026-01-26 15:03:54.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:03:55 compute-1 nova_compute[183403]: 2026-01-26 15:03:55.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:03:55 compute-1 nova_compute[183403]: 2026-01-26 15:03:55.576 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:03:57 compute-1 nova_compute[183403]: 2026-01-26 15:03:57.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:03:57 compute-1 nova_compute[183403]: 2026-01-26 15:03:57.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.091 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.223 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.224 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.239 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.239 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6081MB free_disk=73.18490982055664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.240 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:03:58 compute-1 nova_compute[183403]: 2026-01-26 15:03:58.240 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:04:00 compute-1 nova_compute[183403]: 2026-01-26 15:04:00.405 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:04:00 compute-1 nova_compute[183403]: 2026-01-26 15:04:00.406 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:03:58 up 59 min,  0 user,  load average: 0.06, 0.25, 0.49\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:04:00 compute-1 nova_compute[183403]: 2026-01-26 15:04:00.430 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:04:01 compute-1 nova_compute[183403]: 2026-01-26 15:04:01.019 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:04:01 compute-1 nova_compute[183403]: 2026-01-26 15:04:01.528 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:04:01 compute-1 nova_compute[183403]: 2026-01-26 15:04:01.528 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.289s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:04:02 compute-1 nova_compute[183403]: 2026-01-26 15:04:02.523 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:04:02 compute-1 nova_compute[183403]: 2026-01-26 15:04:02.523 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:04:03 compute-1 nova_compute[183403]: 2026-01-26 15:04:03.032 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:04:03 compute-1 nova_compute[183403]: 2026-01-26 15:04:03.032 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:04:03 compute-1 sshd-session[204091]: Invalid user admin from 185.246.128.170 port 26236
Jan 26 15:04:05 compute-1 podman[192725]: time="2026-01-26T15:04:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:04:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:04:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:04:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:04:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2151 "" "Go-http-client/1.1"
Jan 26 15:04:05 compute-1 podman[204095]: 2026-01-26 15:04:05.889907642 +0000 UTC m=+0.064660176 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Jan 26 15:04:05 compute-1 podman[204094]: 2026-01-26 15:04:05.899024591 +0000 UTC m=+0.063764835 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:04:13 compute-1 sshd-session[204091]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 26236 ssh2 [preauth]
Jan 26 15:04:13 compute-1 sshd-session[204091]: Disconnecting invalid user admin 185.246.128.170 port 26236: Too many authentication failures [preauth]
Jan 26 15:04:17 compute-1 podman[204143]: 2026-01-26 15:04:17.881301208 +0000 UTC m=+0.057960292 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 15:04:17 compute-1 podman[204142]: 2026-01-26 15:04:17.905275028 +0000 UTC m=+0.086985927 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 26 15:04:19 compute-1 openstack_network_exporter[195610]: ERROR   15:04:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:04:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:04:19 compute-1 openstack_network_exporter[195610]: ERROR   15:04:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:04:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:04:21 compute-1 sshd-session[204140]: Invalid user admin from 185.246.128.170 port 8295
Jan 26 15:04:22 compute-1 sshd-session[204140]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 8295 ssh2 [preauth]
Jan 26 15:04:22 compute-1 sshd-session[204140]: Disconnecting invalid user admin 185.246.128.170 port 8295: Too many authentication failures [preauth]
Jan 26 15:04:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:04:29.021 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:04:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:04:29.021 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:04:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:04:29.022 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:04:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:04:30.951 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:04:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:04:30.952 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:04:34 compute-1 sshd-session[204186]: Invalid user admin from 185.246.128.170 port 24103
Jan 26 15:04:35 compute-1 podman[192725]: time="2026-01-26T15:04:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:04:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:04:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:04:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:04:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Jan 26 15:04:36 compute-1 podman[204190]: 2026-01-26 15:04:36.884488794 +0000 UTC m=+0.059005554 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:04:36 compute-1 podman[204191]: 2026-01-26 15:04:36.91439877 +0000 UTC m=+0.075873871 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 26 15:04:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:04:40.954 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:04:42 compute-1 sshd-session[204186]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 24103 ssh2 [preauth]
Jan 26 15:04:42 compute-1 sshd-session[204186]: Disconnecting invalid user admin 185.246.128.170 port 24103: Too many authentication failures [preauth]
Jan 26 15:04:44 compute-1 sshd-session[204236]: Invalid user admin from 185.246.128.170 port 16542
Jan 26 15:04:48 compute-1 podman[204239]: 2026-01-26 15:04:48.923618537 +0000 UTC m=+0.088427509 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 15:04:48 compute-1 podman[204238]: 2026-01-26 15:04:48.973058971 +0000 UTC m=+0.150042942 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, 
maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:04:49 compute-1 openstack_network_exporter[195610]: ERROR   15:04:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:04:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:04:49 compute-1 openstack_network_exporter[195610]: ERROR   15:04:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:04:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:04:53 compute-1 sshd-session[204284]: Connection closed by 80.94.92.171 port 57124
Jan 26 15:04:54 compute-1 nova_compute[183403]: 2026-01-26 15:04:54.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:04:56 compute-1 nova_compute[183403]: 2026-01-26 15:04:56.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:04:57 compute-1 nova_compute[183403]: 2026-01-26 15:04:57.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:04:57 compute-1 nova_compute[183403]: 2026-01-26 15:04:57.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:04:57 compute-1 nova_compute[183403]: 2026-01-26 15:04:57.576 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:04:57 compute-1 nova_compute[183403]: 2026-01-26 15:04:57.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.159 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.159 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.160 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.160 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.334 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.336 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.354 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.355 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6077MB free_disk=73.18488693237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.355 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:04:58 compute-1 nova_compute[183403]: 2026-01-26 15:04:58.356 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:04:59 compute-1 nova_compute[183403]: 2026-01-26 15:04:59.529 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:04:59 compute-1 nova_compute[183403]: 2026-01-26 15:04:59.530 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:04:58 up  1:00,  0 user,  load average: 0.02, 0.20, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:04:59 compute-1 nova_compute[183403]: 2026-01-26 15:04:59.556 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:05:00 compute-1 nova_compute[183403]: 2026-01-26 15:05:00.112 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:05:00 compute-1 nova_compute[183403]: 2026-01-26 15:05:00.638 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:05:00 compute-1 nova_compute[183403]: 2026-01-26 15:05:00.639 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.283s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:05:02 compute-1 nova_compute[183403]: 2026-01-26 15:05:02.640 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:05:02 compute-1 nova_compute[183403]: 2026-01-26 15:05:02.640 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:05:03 compute-1 nova_compute[183403]: 2026-01-26 15:05:03.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:05:05 compute-1 sshd-session[204236]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 16542 ssh2 [preauth]
Jan 26 15:05:05 compute-1 sshd-session[204236]: Disconnecting invalid user admin 185.246.128.170 port 16542: Too many authentication failures [preauth]
Jan 26 15:05:05 compute-1 podman[192725]: time="2026-01-26T15:05:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:05:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:05:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:05:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:05:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2156 "" "Go-http-client/1.1"
Jan 26 15:05:07 compute-1 podman[204287]: 2026-01-26 15:05:07.896561331 +0000 UTC m=+0.064623993 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:05:07 compute-1 podman[204288]: 2026-01-26 15:05:07.909795415 +0000 UTC m=+0.071555272 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Jan 26 15:05:19 compute-1 openstack_network_exporter[195610]: ERROR   15:05:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:05:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:05:19 compute-1 openstack_network_exporter[195610]: ERROR   15:05:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:05:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:05:20 compute-1 podman[204332]: 2026-01-26 15:05:20.152525668 +0000 UTC m=+0.329048947 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 15:05:20 compute-1 podman[204333]: 2026-01-26 15:05:20.161667647 +0000 UTC m=+0.332703390 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 15:05:25 compute-1 sshd-session[204330]: Invalid user admin from 185.246.128.170 port 14913
Jan 26 15:05:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:29.022 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:05:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:29.022 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:05:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:29.023 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:05:30 compute-1 sshd-session[204330]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.170 port 14913 ssh2 [preauth]
Jan 26 15:05:30 compute-1 sshd-session[204330]: Disconnecting invalid user admin 185.246.128.170 port 14913: Too many authentication failures [preauth]
Jan 26 15:05:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:31.391 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:05:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:31.392 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:05:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:31.520 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:71:4e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-32ec1104-f9cc-4957-b081-02d91f2430ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3dd6a2d8596443749a561af56bd6c6e6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=081594a4-a28d-46d7-a62f-5448181a5a8e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a8d358b6-6e12-4dfe-8bd4-900037efed58) old=Port_Binding(mac=['fa:16:3e:8f:71:4e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-32ec1104-f9cc-4957-b081-02d91f2430ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3dd6a2d8596443749a561af56bd6c6e6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:05:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:31.521 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a8d358b6-6e12-4dfe-8bd4-900037efed58 in datapath 32ec1104-f9cc-4957-b081-02d91f2430ae updated
Jan 26 15:05:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:31.522 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 32ec1104-f9cc-4957-b081-02d91f2430ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:05:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:31.523 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[edfa986f-1c18-44dd-9329-9bef9976d20c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:05:35 compute-1 podman[192725]: time="2026-01-26T15:05:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:05:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:05:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:05:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:05:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 26 15:05:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:36.393 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:05:38 compute-1 podman[204377]: 2026-01-26 15:05:38.868307928 +0000 UTC m=+0.048672175 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:05:38 compute-1 podman[204378]: 2026-01-26 15:05:38.881107967 +0000 UTC m=+0.056100294 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 26 15:05:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:43.428 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:94:cf 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4f9424c4-cc96-4a87-bc7c-47a630056f2a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f9424c4-cc96-4a87-bc7c-47a630056f2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84dcf9273f5342fea8d5f6c33adc15c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f883cbe-8c8a-4f43-ad04-475b8632f5f3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0160f431-c5a6-49ae-af46-42eca2a95dd5) old=Port_Binding(mac=['fa:16:3e:f0:94:cf'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4f9424c4-cc96-4a87-bc7c-47a630056f2a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f9424c4-cc96-4a87-bc7c-47a630056f2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84dcf9273f5342fea8d5f6c33adc15c6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:05:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:43.429 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0160f431-c5a6-49ae-af46-42eca2a95dd5 in datapath 4f9424c4-cc96-4a87-bc7c-47a630056f2a updated
Jan 26 15:05:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:43.430 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f9424c4-cc96-4a87-bc7c-47a630056f2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:05:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:05:43.430 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5ebf71-d965-4129-8a92-ddb088bfdf13]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:05:49 compute-1 openstack_network_exporter[195610]: ERROR   15:05:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:05:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:05:49 compute-1 openstack_network_exporter[195610]: ERROR   15:05:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:05:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:05:50 compute-1 podman[204424]: 2026-01-26 15:05:50.911107149 +0000 UTC m=+0.093891912 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 26 15:05:50 compute-1 podman[204425]: 2026-01-26 15:05:50.916764399 +0000 UTC m=+0.088915546 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Jan 26 15:05:51 compute-1 sshd-session[204422]: Invalid user admin from 185.246.128.170 port 19690
Jan 26 15:05:54 compute-1 nova_compute[183403]: 2026-01-26 15:05:54.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:05:56 compute-1 nova_compute[183403]: 2026-01-26 15:05:56.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:05:57 compute-1 nova_compute[183403]: 2026-01-26 15:05:57.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:05:57 compute-1 nova_compute[183403]: 2026-01-26 15:05:57.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:05:57 compute-1 nova_compute[183403]: 2026-01-26 15:05:57.576 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:05:57 compute-1 nova_compute[183403]: 2026-01-26 15:05:57.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.239 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.239 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.239 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.240 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:05:58 compute-1 sshd-session[204422]: Disconnecting invalid user admin 185.246.128.170 port 19690: Change of username or service not allowed: (admin,ssh-connection) -> (download,ssh-connection) [preauth]
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.388 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.389 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.403 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.404 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6079MB free_disk=73.18490600585938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.405 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:05:58 compute-1 nova_compute[183403]: 2026-01-26 15:05:58.405 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:00 compute-1 nova_compute[183403]: 2026-01-26 15:06:00.303 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:06:00 compute-1 nova_compute[183403]: 2026-01-26 15:06:00.304 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:05:58 up  1:01,  0 user,  load average: 0.01, 0.16, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:06:00 compute-1 nova_compute[183403]: 2026-01-26 15:06:00.365 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:06:00 compute-1 nova_compute[183403]: 2026-01-26 15:06:00.998 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:06:01 compute-1 nova_compute[183403]: 2026-01-26 15:06:01.536 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:06:01 compute-1 nova_compute[183403]: 2026-01-26 15:06:01.536 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.131s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:01 compute-1 nova_compute[183403]: 2026-01-26 15:06:01.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:02 compute-1 nova_compute[183403]: 2026-01-26 15:06:02.161 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:02 compute-1 nova_compute[183403]: 2026-01-26 15:06:02.161 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:02 compute-1 nova_compute[183403]: 2026-01-26 15:06:02.162 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:06:02 compute-1 nova_compute[183403]: 2026-01-26 15:06:02.676 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:06:03 compute-1 nova_compute[183403]: 2026-01-26 15:06:03.091 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:03 compute-1 nova_compute[183403]: 2026-01-26 15:06:03.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:03 compute-1 nova_compute[183403]: 2026-01-26 15:06:03.576 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:06:04 compute-1 nova_compute[183403]: 2026-01-26 15:06:04.081 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:05 compute-1 sshd-session[204470]: Invalid user download from 185.246.128.170 port 37612
Jan 26 15:06:05 compute-1 podman[192725]: time="2026-01-26T15:06:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:06:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:06:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:06:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:06:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 26 15:06:06 compute-1 nova_compute[183403]: 2026-01-26 15:06:06.582 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:09 compute-1 podman[204472]: 2026-01-26 15:06:09.881059216 +0000 UTC m=+0.058716242 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:06:09 compute-1 podman[204473]: 2026-01-26 15:06:09.920524401 +0000 UTC m=+0.089939960 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container)
Jan 26 15:06:10 compute-1 sshd-session[204470]: Disconnecting invalid user download 185.246.128.170 port 37612: Change of username or service not allowed: (download,ssh-connection) -> (bbs,ssh-connection) [preauth]
Jan 26 15:06:14 compute-1 sshd-session[204518]: Invalid user bbs from 185.246.128.170 port 9393
Jan 26 15:06:15 compute-1 sshd-session[204518]: Disconnecting invalid user bbs 185.246.128.170 port 9393: Change of username or service not allowed: (bbs,ssh-connection) -> (supervisor,ssh-connection) [preauth]
Jan 26 15:06:19 compute-1 openstack_network_exporter[195610]: ERROR   15:06:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:06:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:06:19 compute-1 openstack_network_exporter[195610]: ERROR   15:06:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:06:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:06:19 compute-1 nova_compute[183403]: 2026-01-26 15:06:19.848 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:19 compute-1 nova_compute[183403]: 2026-01-26 15:06:19.849 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:20 compute-1 nova_compute[183403]: 2026-01-26 15:06:20.354 183407 DEBUG nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:06:20 compute-1 nova_compute[183403]: 2026-01-26 15:06:20.992 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:20 compute-1 nova_compute[183403]: 2026-01-26 15:06:20.992 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:21 compute-1 nova_compute[183403]: 2026-01-26 15:06:21.000 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:06:21 compute-1 nova_compute[183403]: 2026-01-26 15:06:21.001 183407 INFO nova.compute.claims [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:06:21 compute-1 sshd-session[204520]: Invalid user supervisor from 185.246.128.170 port 46535
Jan 26 15:06:21 compute-1 podman[204523]: 2026-01-26 15:06:21.906015728 +0000 UTC m=+0.086195644 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 15:06:21 compute-1 podman[204522]: 2026-01-26 15:06:21.947168319 +0000 UTC m=+0.125489243 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260120, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 26 15:06:22 compute-1 nova_compute[183403]: 2026-01-26 15:06:22.059 183407 DEBUG nova.compute.provider_tree [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:06:22 compute-1 nova_compute[183403]: 2026-01-26 15:06:22.571 183407 DEBUG nova.scheduler.client.report [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:06:22 compute-1 sshd-session[204520]: Disconnecting invalid user supervisor 185.246.128.170 port 46535: Change of username or service not allowed: (supervisor,ssh-connection) -> (ftpguest,ssh-connection) [preauth]
Jan 26 15:06:23 compute-1 nova_compute[183403]: 2026-01-26 15:06:23.085 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:23 compute-1 nova_compute[183403]: 2026-01-26 15:06:23.086 183407 DEBUG nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:06:23 compute-1 nova_compute[183403]: 2026-01-26 15:06:23.598 183407 DEBUG nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:06:23 compute-1 nova_compute[183403]: 2026-01-26 15:06:23.599 183407 DEBUG nova.network.neutron [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:06:23 compute-1 nova_compute[183403]: 2026-01-26 15:06:23.600 183407 WARNING neutronclient.v2_0.client [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:06:23 compute-1 nova_compute[183403]: 2026-01-26 15:06:23.601 183407 WARNING neutronclient.v2_0.client [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:06:24 compute-1 nova_compute[183403]: 2026-01-26 15:06:24.110 183407 INFO nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:06:24 compute-1 nova_compute[183403]: 2026-01-26 15:06:24.621 183407 DEBUG nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:06:25 compute-1 nova_compute[183403]: 2026-01-26 15:06:25.168 183407 DEBUG nova.network.neutron [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Successfully created port: 5e99f9d9-3c39-42b0-a69f-a83f25565205 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:06:25 compute-1 nova_compute[183403]: 2026-01-26 15:06:25.645 183407 DEBUG nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:06:25 compute-1 nova_compute[183403]: 2026-01-26 15:06:25.647 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:06:25 compute-1 nova_compute[183403]: 2026-01-26 15:06:25.648 183407 INFO nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Creating image(s)
Jan 26 15:06:25 compute-1 nova_compute[183403]: 2026-01-26 15:06:25.649 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "/var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:25 compute-1 nova_compute[183403]: 2026-01-26 15:06:25.650 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "/var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:25 compute-1 nova_compute[183403]: 2026-01-26 15:06:25.651 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "/var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:25 compute-1 nova_compute[183403]: 2026-01-26 15:06:25.652 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:25 compute-1 nova_compute[183403]: 2026-01-26 15:06:25.653 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.705 183407 DEBUG nova.network.neutron [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Successfully updated port: 5e99f9d9-3c39-42b0-a69f-a83f25565205 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.818 183407 DEBUG nova.compute.manager [req-d3b9fc79-a592-4ebf-a09c-02d44d122fc2 req-a4795e23-54a1-48e4-b13f-4ebd09473269 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Received event network-changed-5e99f9d9-3c39-42b0-a69f-a83f25565205 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.819 183407 DEBUG nova.compute.manager [req-d3b9fc79-a592-4ebf-a09c-02d44d122fc2 req-a4795e23-54a1-48e4-b13f-4ebd09473269 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Refreshing instance network info cache due to event network-changed-5e99f9d9-3c39-42b0-a69f-a83f25565205. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.819 183407 DEBUG oslo_concurrency.lockutils [req-d3b9fc79-a592-4ebf-a09c-02d44d122fc2 req-a4795e23-54a1-48e4-b13f-4ebd09473269 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-99dec4e9-b3d8-43a5-ac11-01f6490a6d99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.819 183407 DEBUG oslo_concurrency.lockutils [req-d3b9fc79-a592-4ebf-a09c-02d44d122fc2 req-a4795e23-54a1-48e4-b13f-4ebd09473269 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-99dec4e9-b3d8-43a5-ac11-01f6490a6d99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.819 183407 DEBUG nova.network.neutron [req-d3b9fc79-a592-4ebf-a09c-02d44d122fc2 req-a4795e23-54a1-48e4-b13f-4ebd09473269 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Refreshing network info cache for port 5e99f9d9-3c39-42b0-a69f-a83f25565205 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.841 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.843 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.844 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.899 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0.part --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.902 183407 DEBUG nova.virt.images [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] 354e4d0e-4287-404f-93d3-2c85cfe92fbc was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.903 183407 DEBUG nova.privsep.utils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Jan 26 15:06:26 compute-1 nova_compute[183403]: 2026-01-26 15:06:26.904 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0.part /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.134 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0.part /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0.converted" returned: 0 in 0.230s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.138 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.185 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0.converted --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.186 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.533s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.187 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.194 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.196 183407 INFO oslo.privsep.daemon [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpffwifdlz/privsep.sock']
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.212 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "refresh_cache-99dec4e9-b3d8-43a5-ac11-01f6490a6d99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.330 183407 WARNING neutronclient.v2_0.client [req-d3b9fc79-a592-4ebf-a09c-02d44d122fc2 req-a4795e23-54a1-48e4-b13f-4ebd09473269 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.429 183407 DEBUG nova.network.neutron [req-d3b9fc79-a592-4ebf-a09c-02d44d122fc2 req-a4795e23-54a1-48e4-b13f-4ebd09473269 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.937 183407 INFO oslo.privsep.daemon [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Spawned new privsep daemon via rootwrap
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.785 204583 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.790 204583 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.792 204583 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.792 204583 INFO oslo.privsep.daemon [-] privsep daemon running as pid 204583
Jan 26 15:06:27 compute-1 nova_compute[183403]: 2026-01-26 15:06:27.971 183407 DEBUG nova.network.neutron [req-d3b9fc79-a592-4ebf-a09c-02d44d122fc2 req-a4795e23-54a1-48e4-b13f-4ebd09473269 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.035 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.098 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.099 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.100 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.100 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.103 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.103 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.152 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.153 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.186 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.188 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.189 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.250 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.251 183407 DEBUG nova.virt.disk.api [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Checking if we can resize image /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.251 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.304 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.306 183407 DEBUG nova.virt.disk.api [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Cannot resize image /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.306 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.307 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Ensure instance console log exists: /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.307 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.307 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.308 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.490 183407 DEBUG oslo_concurrency.lockutils [req-d3b9fc79-a592-4ebf-a09c-02d44d122fc2 req-a4795e23-54a1-48e4-b13f-4ebd09473269 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-99dec4e9-b3d8-43a5-ac11-01f6490a6d99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.491 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquired lock "refresh_cache-99dec4e9-b3d8-43a5-ac11-01f6490a6d99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:06:28 compute-1 nova_compute[183403]: 2026-01-26 15:06:28.491 183407 DEBUG nova.network.neutron [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:06:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:29.023 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:29.024 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:29.024 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:29 compute-1 nova_compute[183403]: 2026-01-26 15:06:29.991 183407 DEBUG nova.network.neutron [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.182 183407 WARNING neutronclient.v2_0.client [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.347 183407 DEBUG nova.network.neutron [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Updating instance_info_cache with network_info: [{"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.979 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Releasing lock "refresh_cache-99dec4e9-b3d8-43a5-ac11-01f6490a6d99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.980 183407 DEBUG nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Instance network_info: |[{"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.983 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Start _get_guest_xml network_info=[{"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.987 183407 WARNING nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.989 183407 DEBUG nova.virt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-2084520715', uuid='99dec4e9-b3d8-43a5-ac11-01f6490a6d99'), owner=OwnerMeta(userid='eabbaf52ffab409ca33b5568f1dc327f', username='tempest-TestDataModel-285827473-project-admin', projectid='84dcf9273f5342fea8d5f6c33adc15c6', projectname='tempest-TestDataModel-285827473'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769439990.9894261) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.993 183407 DEBUG nova.virt.libvirt.host [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.993 183407 DEBUG nova.virt.libvirt.host [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.997 183407 DEBUG nova.virt.libvirt.host [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:06:30 compute-1 nova_compute[183403]: 2026-01-26 15:06:30.998 183407 DEBUG nova.virt.libvirt.host [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.000 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.000 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.001 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.001 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.002 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.002 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.002 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.003 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.003 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.003 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.003 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.004 183407 DEBUG nova.virt.hardware [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.008 183407 DEBUG nova.privsep.utils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.010 183407 DEBUG nova.virt.libvirt.vif [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:06:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-2084520715',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-2084520715',id=3,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84dcf9273f5342fea8d5f6c33adc15c6',ramdisk_id='',reservation_id='r-50punrgc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-285827473',owner_user_name='tempest-TestDataModel-285827473-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:06:24Z,user_data=None,user_id='eabbaf52ffab409ca33b5568f1dc327f',uuid=99dec4e9-b3d8-43a5-ac11-01f6490a6d99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.010 183407 DEBUG nova.network.os_vif_util [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Converting VIF {"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.011 183407 DEBUG nova.network.os_vif_util [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:09:cb,bridge_name='br-int',has_traffic_filtering=True,id=5e99f9d9-3c39-42b0-a69f-a83f25565205,network=Network(32ec1104-f9cc-4957-b081-02d91f2430ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e99f9d9-3c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.013 183407 DEBUG nova.objects.instance [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 99dec4e9-b3d8-43a5-ac11-01f6490a6d99 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.521 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <uuid>99dec4e9-b3d8-43a5-ac11-01f6490a6d99</uuid>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <name>instance-00000003</name>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <nova:name>tempest-TestDataModel-server-2084520715</nova:name>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:06:30</nova:creationTime>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:06:31 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:06:31 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:user uuid="eabbaf52ffab409ca33b5568f1dc327f">tempest-TestDataModel-285827473-project-admin</nova:user>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:project uuid="84dcf9273f5342fea8d5f6c33adc15c6">tempest-TestDataModel-285827473</nova:project>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         <nova:port uuid="5e99f9d9-3c39-42b0-a69f-a83f25565205">
Jan 26 15:06:31 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <system>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <entry name="serial">99dec4e9-b3d8-43a5-ac11-01f6490a6d99</entry>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <entry name="uuid">99dec4e9-b3d8-43a5-ac11-01f6490a6d99</entry>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     </system>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <os>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   </os>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <features>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   </features>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk.config"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:4c:09:cb"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <target dev="tap5e99f9d9-3c"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/console.log" append="off"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <video>
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     </video>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:06:31 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:06:31 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:06:31 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:06:31 compute-1 nova_compute[183403]: </domain>
Jan 26 15:06:31 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.523 183407 DEBUG nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Preparing to wait for external event network-vif-plugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.523 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.523 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.523 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.524 183407 DEBUG nova.virt.libvirt.vif [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:06:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-2084520715',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-2084520715',id=3,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84dcf9273f5342fea8d5f6c33adc15c6',ramdisk_id='',reservation_id='r-50punrgc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-285827473',owner_user_name='tempest-TestDataModel-285827473-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:06:24Z,user_data=None,user_id='eabbaf52ffab409ca33b5568f1dc327f',uuid=99dec4e9-b3d8-43a5-ac11-01f6490a6d99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.525 183407 DEBUG nova.network.os_vif_util [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Converting VIF {"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.525 183407 DEBUG nova.network.os_vif_util [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:09:cb,bridge_name='br-int',has_traffic_filtering=True,id=5e99f9d9-3c39-42b0-a69f-a83f25565205,network=Network(32ec1104-f9cc-4957-b081-02d91f2430ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e99f9d9-3c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.526 183407 DEBUG os_vif [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:09:cb,bridge_name='br-int',has_traffic_filtering=True,id=5e99f9d9-3c39-42b0-a69f-a83f25565205,network=Network(32ec1104-f9cc-4957-b081-02d91f2430ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e99f9d9-3c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.563 183407 DEBUG ovsdbapp.backend.ovs_idl [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.564 183407 DEBUG ovsdbapp.backend.ovs_idl [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.564 183407 DEBUG ovsdbapp.backend.ovs_idl [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.564 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.565 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.565 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.565 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.567 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.569 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.576 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.576 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.577 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.577 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.578 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '211a0847-1e2d-54d8-8756-458d38d2f35a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.579 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.580 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.581 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:31 compute-1 nova_compute[183403]: 2026-01-26 15:06:31.582 183407 INFO oslo.privsep.daemon [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpcpsxiddm/privsep.sock']
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.563 183407 INFO oslo.privsep.daemon [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Spawned new privsep daemon via rootwrap
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.198 204607 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.202 204607 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.206 204607 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.206 204607 INFO oslo.privsep.daemon [-] privsep daemon running as pid 204607
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.806 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.807 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e99f9d9-3c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.807 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap5e99f9d9-3c, col_values=(('qos', UUID('e13c792c-797c-43ce-9c6e-e86f75b1d6cf')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.809 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap5e99f9d9-3c, col_values=(('external_ids', {'iface-id': '5e99f9d9-3c39-42b0-a69f-a83f25565205', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4c:09:cb', 'vm-uuid': '99dec4e9-b3d8-43a5-ac11-01f6490a6d99'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.853 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:32 compute-1 NetworkManager[55716]: <info>  [1769439992.8545] manager: (tap5e99f9d9-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.855 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.859 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:32 compute-1 nova_compute[183403]: 2026-01-26 15:06:32.860 183407 INFO os_vif [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:09:cb,bridge_name='br-int',has_traffic_filtering=True,id=5e99f9d9-3c39-42b0-a69f-a83f25565205,network=Network(32ec1104-f9cc-4957-b081-02d91f2430ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e99f9d9-3c')
Jan 26 15:06:33 compute-1 nova_compute[183403]: 2026-01-26 15:06:33.309 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:34 compute-1 sshd-session[204589]: Invalid user ftpguest from 185.246.128.170 port 25773
Jan 26 15:06:34 compute-1 nova_compute[183403]: 2026-01-26 15:06:34.577 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:06:34 compute-1 nova_compute[183403]: 2026-01-26 15:06:34.577 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:06:34 compute-1 nova_compute[183403]: 2026-01-26 15:06:34.577 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] No VIF found with MAC fa:16:3e:4c:09:cb, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:06:34 compute-1 nova_compute[183403]: 2026-01-26 15:06:34.578 183407 INFO nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Using config drive
Jan 26 15:06:35 compute-1 nova_compute[183403]: 2026-01-26 15:06:35.219 183407 WARNING neutronclient.v2_0.client [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:06:35 compute-1 podman[192725]: time="2026-01-26T15:06:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:06:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:06:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:06:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:06:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2156 "" "Go-http-client/1.1"
Jan 26 15:06:36 compute-1 nova_compute[183403]: 2026-01-26 15:06:36.071 183407 INFO nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Creating config drive at /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk.config
Jan 26 15:06:36 compute-1 nova_compute[183403]: 2026-01-26 15:06:36.075 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpi99tpq5z execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:36 compute-1 nova_compute[183403]: 2026-01-26 15:06:36.198 183407 DEBUG oslo_concurrency.processutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpi99tpq5z" returned: 0 in 0.123s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:36 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 26 15:06:36 compute-1 kernel: tap5e99f9d9-3c: entered promiscuous mode
Jan 26 15:06:36 compute-1 NetworkManager[55716]: <info>  [1769439996.2891] manager: (tap5e99f9d9-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Jan 26 15:06:36 compute-1 ovn_controller[95641]: 2026-01-26T15:06:36Z|00040|binding|INFO|Claiming lport 5e99f9d9-3c39-42b0-a69f-a83f25565205 for this chassis.
Jan 26 15:06:36 compute-1 ovn_controller[95641]: 2026-01-26T15:06:36Z|00041|binding|INFO|5e99f9d9-3c39-42b0-a69f-a83f25565205: Claiming fa:16:3e:4c:09:cb 10.100.0.13
Jan 26 15:06:36 compute-1 nova_compute[183403]: 2026-01-26 15:06:36.291 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:36 compute-1 nova_compute[183403]: 2026-01-26 15:06:36.297 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:36 compute-1 systemd-udevd[204634]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:06:36 compute-1 NetworkManager[55716]: <info>  [1769439996.3428] device (tap5e99f9d9-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:06:36 compute-1 NetworkManager[55716]: <info>  [1769439996.3438] device (tap5e99f9d9-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:06:36 compute-1 systemd-machined[154697]: New machine qemu-1-instance-00000003.
Jan 26 15:06:36 compute-1 nova_compute[183403]: 2026-01-26 15:06:36.359 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:36 compute-1 ovn_controller[95641]: 2026-01-26T15:06:36Z|00042|binding|INFO|Setting lport 5e99f9d9-3c39-42b0-a69f-a83f25565205 ovn-installed in OVS
Jan 26 15:06:36 compute-1 nova_compute[183403]: 2026-01-26 15:06:36.364 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:36 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.369 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:09:cb 10.100.0.13'], port_security=['fa:16:3e:4c:09:cb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '99dec4e9-b3d8-43a5-ac11-01f6490a6d99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-32ec1104-f9cc-4957-b081-02d91f2430ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84dcf9273f5342fea8d5f6c33adc15c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e10eb692-4579-4957-a822-71520dd9f30a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=081594a4-a28d-46d7-a62f-5448181a5a8e, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=5e99f9d9-3c39-42b0-a69f-a83f25565205) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:06:36 compute-1 ovn_controller[95641]: 2026-01-26T15:06:36Z|00043|binding|INFO|Setting lport 5e99f9d9-3c39-42b0-a69f-a83f25565205 up in Southbound
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.370 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 5e99f9d9-3c39-42b0-a69f-a83f25565205 in datapath 32ec1104-f9cc-4957-b081-02d91f2430ae bound to our chassis
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.371 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 32ec1104-f9cc-4957-b081-02d91f2430ae
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.394 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[79c07389-5bcf-4716-b202-9e7cb6a87e0f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.395 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap32ec1104-f1 in ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.399 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap32ec1104-f0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.399 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec9bab5-b66b-40f2-9fb1-4eb682fc5859]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.400 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0ecd06d9-027d-4214-9460-4255f65d2b23]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.417 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8d3b34-c439-444e-ba12-48c1b75644f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.425 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[90209c22-98b7-4d34-b9ab-d09f469ef405]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:36.427 104930 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp75445j30/privsep.sock']
Jan 26 15:06:36 compute-1 sshd-session[204589]: Disconnecting invalid user ftpguest 185.246.128.170 port 25773: Change of username or service not allowed: (ftpguest,ssh-connection) -> (Cisco,ssh-connection) [preauth]
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.162 183407 DEBUG nova.compute.manager [req-44ae8cce-56db-4a39-b376-7d6492335457 req-7fc12086-50c2-4379-892c-241760b45fb0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Received event network-vif-plugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.164 183407 DEBUG oslo_concurrency.lockutils [req-44ae8cce-56db-4a39-b376-7d6492335457 req-7fc12086-50c2-4379-892c-241760b45fb0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.165 183407 DEBUG oslo_concurrency.lockutils [req-44ae8cce-56db-4a39-b376-7d6492335457 req-7fc12086-50c2-4379-892c-241760b45fb0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.166 183407 DEBUG oslo_concurrency.lockutils [req-44ae8cce-56db-4a39-b376-7d6492335457 req-7fc12086-50c2-4379-892c-241760b45fb0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.166 183407 DEBUG nova.compute.manager [req-44ae8cce-56db-4a39-b376-7d6492335457 req-7fc12086-50c2-4379-892c-241760b45fb0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Processing event network-vif-plugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.168 183407 DEBUG nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.172 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.182 104930 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.182 104930 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp75445j30/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.182 183407 INFO nova.virt.libvirt.driver [-] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Instance spawned successfully.
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.028 204665 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.031 204665 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.033 204665 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.033 204665 INFO oslo.privsep.daemon [-] privsep daemon running as pid 204665
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.183 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[09fe0d88-e931-49c4-91d3-9aeac8a52262]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.183 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.495 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.496 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.515 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.838 204665 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.839 204665 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:37.839 204665 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:37 compute-1 nova_compute[183403]: 2026-01-26 15:06:37.853 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.039 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.039 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.040 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.040 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.041 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.041 183407 DEBUG nova.virt.libvirt.driver [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.126 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Triggering sync for uuid 99dec4e9-b3d8-43a5-ac11-01f6490a6d99 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.127 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.311 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.339 204665 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.344 204665 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.423 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[e9497622-55de-4c31-aa78-167fa5d030e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 NetworkManager[55716]: <info>  [1769439998.4796] manager: (tap32ec1104-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.479 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[50445551-e7fa-43fe-8ce0-40c7fbc3a101]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.527 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[abb79558-0cec-48af-b392-4d46d6bfa6e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.530 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[f53d2b72-2efb-49b8-a936-222c5f01e82d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 NetworkManager[55716]: <info>  [1769439998.5576] device (tap32ec1104-f0): carrier: link connected
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.559 183407 INFO nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Took 12.91 seconds to spawn the instance on the hypervisor.
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.560 183407 DEBUG nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.564 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[25c35a40-d371-4100-ad15-1743c8681b5f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.578 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8506386f-7d1e-4f78-a520-8b1bcde18cc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap32ec1104-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:71:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372116, 'reachable_time': 20019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 204692, 'error': None, 'target': 'ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.591 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb90241-c3b4-437b-9a4a-cb0fadcecb36]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:714e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372116, 'tstamp': 372116}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204693, 'error': None, 'target': 'ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.610 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[329f5fa9-4e53-4ad7-9893-7cb3a02e47bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap32ec1104-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:71:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372116, 'reachable_time': 20019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 204694, 'error': None, 'target': 'ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.650 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a97b4ab1-51eb-4393-942f-fd437192614f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.724 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7bdad9-cad5-4113-a210-6e5a00ce94aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.725 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap32ec1104-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.725 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.726 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap32ec1104-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.728 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:38 compute-1 NetworkManager[55716]: <info>  [1769439998.7299] manager: (tap32ec1104-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 26 15:06:38 compute-1 kernel: tap32ec1104-f0: entered promiscuous mode
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.732 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.736 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap32ec1104-f0, col_values=(('external_ids', {'iface-id': 'a8d358b6-6e12-4dfe-8bd4-900037efed58'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:38 compute-1 ovn_controller[95641]: 2026-01-26T15:06:38Z|00044|binding|INFO|Releasing lport a8d358b6-6e12-4dfe-8bd4-900037efed58 from this chassis (sb_readonly=0)
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.738 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.740 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[744b8d1e-fbaa-42a9-b0d2-b4a830ff8646]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.741 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.741 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.741 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 32ec1104-f9cc-4957-b081-02d91f2430ae disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.741 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.742 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3a5ef318-90f4-488e-b506-a955e8b61d55]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.763 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.763 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a13297dc-6c4a-41da-8982-f2073cb60ab5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.764 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-32ec1104-f9cc-4957-b081-02d91f2430ae
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID 32ec1104-f9cc-4957-b081-02d91f2430ae
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:06:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:38.766 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae', 'env', 'PROCESS_TAG=haproxy-32ec1104-f9cc-4957-b081-02d91f2430ae', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/32ec1104-f9cc-4957-b081-02d91f2430ae.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:06:38 compute-1 nova_compute[183403]: 2026-01-26 15:06:38.766 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.125 183407 INFO nova.compute.manager [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Took 18.26 seconds to build instance.
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.290 183407 DEBUG nova.compute.manager [req-a5ebfc5a-4656-49a7-beef-4f12a8745c71 req-4bace237-9d43-4bcc-a1fc-55ea3d1b56f4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Received event network-vif-plugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.290 183407 DEBUG oslo_concurrency.lockutils [req-a5ebfc5a-4656-49a7-beef-4f12a8745c71 req-4bace237-9d43-4bcc-a1fc-55ea3d1b56f4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.291 183407 DEBUG oslo_concurrency.lockutils [req-a5ebfc5a-4656-49a7-beef-4f12a8745c71 req-4bace237-9d43-4bcc-a1fc-55ea3d1b56f4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.291 183407 DEBUG oslo_concurrency.lockutils [req-a5ebfc5a-4656-49a7-beef-4f12a8745c71 req-4bace237-9d43-4bcc-a1fc-55ea3d1b56f4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.292 183407 DEBUG nova.compute.manager [req-a5ebfc5a-4656-49a7-beef-4f12a8745c71 req-4bace237-9d43-4bcc-a1fc-55ea3d1b56f4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] No waiting events found dispatching network-vif-plugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.292 183407 WARNING nova.compute.manager [req-a5ebfc5a-4656-49a7-beef-4f12a8745c71 req-4bace237-9d43-4bcc-a1fc-55ea3d1b56f4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Received unexpected event network-vif-plugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 for instance with vm_state active and task_state None.
Jan 26 15:06:39 compute-1 podman[204727]: 2026-01-26 15:06:39.305253816 +0000 UTC m=+0.062793874 container create ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:06:39 compute-1 systemd[1]: Started libpod-conmon-ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282.scope.
Jan 26 15:06:39 compute-1 podman[204727]: 2026-01-26 15:06:39.275439538 +0000 UTC m=+0.032979616 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:06:39 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:06:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f9717fa23f0f34342d009e3b91591795e18bc776e249450cc7ee1627a5557b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:06:39 compute-1 podman[204727]: 2026-01-26 15:06:39.394591728 +0000 UTC m=+0.152131816 container init ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Jan 26 15:06:39 compute-1 podman[204727]: 2026-01-26 15:06:39.403658039 +0000 UTC m=+0.161198097 container start ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Jan 26 15:06:39 compute-1 neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae[204742]: [NOTICE]   (204746) : New worker (204748) forked
Jan 26 15:06:39 compute-1 neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae[204742]: [NOTICE]   (204746) : Loading success.
Jan 26 15:06:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:39.467 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.632 183407 DEBUG oslo_concurrency.lockutils [None req-f01d4d40-f1a7-4406-a767-e65030cdd9d9 eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.783s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.633 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.506s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.633 183407 INFO nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 15:06:39 compute-1 nova_compute[183403]: 2026-01-26 15:06:39.634 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:41 compute-1 podman[204759]: 2026-01-26 15:06:41.121075008 +0000 UTC m=+0.103631410 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:06:41 compute-1 podman[204760]: 2026-01-26 15:06:41.149470511 +0000 UTC m=+0.126988829 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 15:06:42 compute-1 nova_compute[183403]: 2026-01-26 15:06:42.857 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:42 compute-1 sshd-session[204758]: Invalid user Cisco from 185.246.128.170 port 39725
Jan 26 15:06:43 compute-1 sshd-session[204758]: Disconnecting invalid user Cisco 185.246.128.170 port 39725: Change of username or service not allowed: (Cisco,ssh-connection) -> (huawei,ssh-connection) [preauth]
Jan 26 15:06:43 compute-1 nova_compute[183403]: 2026-01-26 15:06:43.314 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:44 compute-1 nova_compute[183403]: 2026-01-26 15:06:44.787 183407 DEBUG oslo_concurrency.lockutils [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:44 compute-1 nova_compute[183403]: 2026-01-26 15:06:44.788 183407 DEBUG oslo_concurrency.lockutils [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:44 compute-1 nova_compute[183403]: 2026-01-26 15:06:44.789 183407 DEBUG oslo_concurrency.lockutils [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:44 compute-1 nova_compute[183403]: 2026-01-26 15:06:44.790 183407 DEBUG oslo_concurrency.lockutils [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:44 compute-1 nova_compute[183403]: 2026-01-26 15:06:44.790 183407 DEBUG oslo_concurrency.lockutils [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:44 compute-1 nova_compute[183403]: 2026-01-26 15:06:44.802 183407 INFO nova.compute.manager [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Terminating instance
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.318 183407 DEBUG nova.compute.manager [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:06:45 compute-1 kernel: tap5e99f9d9-3c (unregistering): left promiscuous mode
Jan 26 15:06:45 compute-1 NetworkManager[55716]: <info>  [1769440005.3428] device (tap5e99f9d9-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:06:45 compute-1 ovn_controller[95641]: 2026-01-26T15:06:45Z|00045|binding|INFO|Releasing lport 5e99f9d9-3c39-42b0-a69f-a83f25565205 from this chassis (sb_readonly=0)
Jan 26 15:06:45 compute-1 ovn_controller[95641]: 2026-01-26T15:06:45Z|00046|binding|INFO|Setting lport 5e99f9d9-3c39-42b0-a69f-a83f25565205 down in Southbound
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.351 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.353 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:45 compute-1 ovn_controller[95641]: 2026-01-26T15:06:45Z|00047|binding|INFO|Removing iface tap5e99f9d9-3c ovn-installed in OVS
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.355 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.360 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:09:cb 10.100.0.13'], port_security=['fa:16:3e:4c:09:cb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '99dec4e9-b3d8-43a5-ac11-01f6490a6d99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-32ec1104-f9cc-4957-b081-02d91f2430ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84dcf9273f5342fea8d5f6c33adc15c6', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e10eb692-4579-4957-a822-71520dd9f30a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=081594a4-a28d-46d7-a62f-5448181a5a8e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=5e99f9d9-3c39-42b0-a69f-a83f25565205) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.362 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 5e99f9d9-3c39-42b0-a69f-a83f25565205 in datapath 32ec1104-f9cc-4957-b081-02d91f2430ae unbound from our chassis
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.364 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 32ec1104-f9cc-4957-b081-02d91f2430ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.364 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6a669d-d024-4ecf-906a-f1b2f10c189b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.365 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae namespace which is not needed anymore
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.367 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:45 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 26 15:06:45 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 8.625s CPU time.
Jan 26 15:06:45 compute-1 systemd-machined[154697]: Machine qemu-1-instance-00000003 terminated.
Jan 26 15:06:45 compute-1 neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae[204742]: [NOTICE]   (204746) : haproxy version is 3.0.5-8e879a5
Jan 26 15:06:45 compute-1 neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae[204742]: [NOTICE]   (204746) : path to executable is /usr/sbin/haproxy
Jan 26 15:06:45 compute-1 neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae[204742]: [WARNING]  (204746) : Exiting Master process...
Jan 26 15:06:45 compute-1 podman[204833]: 2026-01-26 15:06:45.478388112 +0000 UTC m=+0.026664964 container kill ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 15:06:45 compute-1 neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae[204742]: [ALERT]    (204746) : Current worker (204748) exited with code 143 (Terminated)
Jan 26 15:06:45 compute-1 neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae[204742]: [WARNING]  (204746) : All workers exited. Exiting... (0)
Jan 26 15:06:45 compute-1 systemd[1]: libpod-ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282.scope: Deactivated successfully.
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.555 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.560 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.581 183407 INFO nova.virt.libvirt.driver [-] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Instance destroyed successfully.
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.582 183407 DEBUG nova.objects.instance [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lazy-loading 'resources' on Instance uuid 99dec4e9-b3d8-43a5-ac11-01f6490a6d99 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:06:45 compute-1 podman[204849]: 2026-01-26 15:06:45.705554732 +0000 UTC m=+0.202339191 container died ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 15:06:45 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282-userdata-shm.mount: Deactivated successfully.
Jan 26 15:06:45 compute-1 systemd[1]: var-lib-containers-storage-overlay-8f9717fa23f0f34342d009e3b91591795e18bc776e249450cc7ee1627a5557b3-merged.mount: Deactivated successfully.
Jan 26 15:06:45 compute-1 podman[204849]: 2026-01-26 15:06:45.760561001 +0000 UTC m=+0.257345400 container cleanup ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120)
Jan 26 15:06:45 compute-1 systemd[1]: libpod-conmon-ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282.scope: Deactivated successfully.
Jan 26 15:06:45 compute-1 podman[204874]: 2026-01-26 15:06:45.927460369 +0000 UTC m=+0.206549307 container remove ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.933 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9e905434-aee3-42a2-a68d-70e12d1dbf7d]: (4, ("Mon Jan 26 03:06:45 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae (ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282)\nab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282\nMon Jan 26 03:06:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae (ab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282)\nab68bfceb855bafb798c7cc0e0874470d9f9a0d93ff542d6fe235f3154507282\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.935 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[39f6935b-bded-44a2-8e27-1f29cd5210b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.936 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/32ec1104-f9cc-4957-b081-02d91f2430ae.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.936 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7d27fd4f-43bf-49c4-aaf1-13fbf77ef707]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.937 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap32ec1104-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:45 compute-1 kernel: tap32ec1104-f0: left promiscuous mode
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.939 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:45 compute-1 nova_compute[183403]: 2026-01-26 15:06:45.957 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.960 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4dfe31e5-382a-4b23-a2bc-c8fe72e1266d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.976 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8539a682-6693-43a7-a32d-28c5a8495e58]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:45.977 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[119c8df5-6cc5-4227-a4d0-14898cd43410]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:46.000 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5f48160d-111f-4e40-b947-a2825edf7560]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372101, 'reachable_time': 21650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 204898, 'error': None, 'target': 'ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:46 compute-1 systemd[1]: run-netns-ovnmeta\x2d32ec1104\x2df9cc\x2d4957\x2db081\x2d02d91f2430ae.mount: Deactivated successfully.
Jan 26 15:06:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:46.006 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-32ec1104-f9cc-4957-b081-02d91f2430ae deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:06:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:46.008 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[89dd258f-d4f1-4443-bc67-a7c03f119022]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.089 183407 DEBUG nova.virt.libvirt.vif [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:06:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-2084520715',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-2084520715',id=3,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:06:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84dcf9273f5342fea8d5f6c33adc15c6',ramdisk_id='',reservation_id='r-50punrgc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestDataModel-285827473',owner_user_name='tempest-TestDataModel-285827473-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:06:38Z,user_data=None,user_id='eabbaf52ffab409ca33b5568f1dc327f',uuid=99dec4e9-b3d8-43a5-ac11-01f6490a6d99,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.089 183407 DEBUG nova.network.os_vif_util [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Converting VIF {"id": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "address": "fa:16:3e:4c:09:cb", "network": {"id": "32ec1104-f9cc-4957-b081-02d91f2430ae", "bridge": "br-int", "label": "tempest-TestDataModel-546723541-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3dd6a2d8596443749a561af56bd6c6e6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e99f9d9-3c", "ovs_interfaceid": "5e99f9d9-3c39-42b0-a69f-a83f25565205", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.090 183407 DEBUG nova.network.os_vif_util [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:09:cb,bridge_name='br-int',has_traffic_filtering=True,id=5e99f9d9-3c39-42b0-a69f-a83f25565205,network=Network(32ec1104-f9cc-4957-b081-02d91f2430ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e99f9d9-3c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.091 183407 DEBUG os_vif [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:09:cb,bridge_name='br-int',has_traffic_filtering=True,id=5e99f9d9-3c39-42b0-a69f-a83f25565205,network=Network(32ec1104-f9cc-4957-b081-02d91f2430ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e99f9d9-3c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.094 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.095 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e99f9d9-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.097 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.098 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.100 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.102 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e13c792c-797c-43ce-9c6e-e86f75b1d6cf) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.104 183407 DEBUG nova.compute.manager [req-7e0bc2d4-a3e5-4578-af9d-e2e9a60f0107 req-5654aa23-e444-4544-a254-37193423a87c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Received event network-vif-unplugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.104 183407 DEBUG oslo_concurrency.lockutils [req-7e0bc2d4-a3e5-4578-af9d-e2e9a60f0107 req-5654aa23-e444-4544-a254-37193423a87c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.105 183407 DEBUG oslo_concurrency.lockutils [req-7e0bc2d4-a3e5-4578-af9d-e2e9a60f0107 req-5654aa23-e444-4544-a254-37193423a87c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.105 183407 DEBUG oslo_concurrency.lockutils [req-7e0bc2d4-a3e5-4578-af9d-e2e9a60f0107 req-5654aa23-e444-4544-a254-37193423a87c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.105 183407 DEBUG nova.compute.manager [req-7e0bc2d4-a3e5-4578-af9d-e2e9a60f0107 req-5654aa23-e444-4544-a254-37193423a87c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] No waiting events found dispatching network-vif-unplugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.106 183407 DEBUG nova.compute.manager [req-7e0bc2d4-a3e5-4578-af9d-e2e9a60f0107 req-5654aa23-e444-4544-a254-37193423a87c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Received event network-vif-unplugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.106 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.107 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.110 183407 INFO os_vif [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:09:cb,bridge_name='br-int',has_traffic_filtering=True,id=5e99f9d9-3c39-42b0-a69f-a83f25565205,network=Network(32ec1104-f9cc-4957-b081-02d91f2430ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e99f9d9-3c')
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.111 183407 INFO nova.virt.libvirt.driver [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Deleting instance files /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99_del
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.112 183407 INFO nova.virt.libvirt.driver [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Deletion of /var/lib/nova/instances/99dec4e9-b3d8-43a5-ac11-01f6490a6d99_del complete
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.629 183407 INFO nova.compute.manager [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Took 1.31 seconds to destroy the instance on the hypervisor.
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.630 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.630 183407 DEBUG nova.compute.manager [-] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.630 183407 DEBUG nova.network.neutron [-] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:06:46 compute-1 nova_compute[183403]: 2026-01-26 15:06:46.630 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:06:47 compute-1 nova_compute[183403]: 2026-01-26 15:06:47.010 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:06:47 compute-1 nova_compute[183403]: 2026-01-26 15:06:47.818 183407 DEBUG nova.network.neutron [-] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.237 183407 DEBUG nova.compute.manager [req-96cd4986-d5cc-4eee-8613-6d301e4c7d65 req-4888dc9e-d486-41a1-8213-bdf780f75176 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Received event network-vif-unplugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.238 183407 DEBUG oslo_concurrency.lockutils [req-96cd4986-d5cc-4eee-8613-6d301e4c7d65 req-4888dc9e-d486-41a1-8213-bdf780f75176 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.238 183407 DEBUG oslo_concurrency.lockutils [req-96cd4986-d5cc-4eee-8613-6d301e4c7d65 req-4888dc9e-d486-41a1-8213-bdf780f75176 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.239 183407 DEBUG oslo_concurrency.lockutils [req-96cd4986-d5cc-4eee-8613-6d301e4c7d65 req-4888dc9e-d486-41a1-8213-bdf780f75176 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.239 183407 DEBUG nova.compute.manager [req-96cd4986-d5cc-4eee-8613-6d301e4c7d65 req-4888dc9e-d486-41a1-8213-bdf780f75176 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] No waiting events found dispatching network-vif-unplugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.239 183407 DEBUG nova.compute.manager [req-96cd4986-d5cc-4eee-8613-6d301e4c7d65 req-4888dc9e-d486-41a1-8213-bdf780f75176 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Received event network-vif-unplugged-5e99f9d9-3c39-42b0-a69f-a83f25565205 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.240 183407 DEBUG nova.compute.manager [req-96cd4986-d5cc-4eee-8613-6d301e4c7d65 req-4888dc9e-d486-41a1-8213-bdf780f75176 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Received event network-vif-deleted-5e99f9d9-3c39-42b0-a69f-a83f25565205 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.316 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.358 183407 INFO nova.compute.manager [-] [instance: 99dec4e9-b3d8-43a5-ac11-01f6490a6d99] Took 1.73 seconds to deallocate network for instance.
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.915 183407 DEBUG oslo_concurrency.lockutils [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.916 183407 DEBUG oslo_concurrency.lockutils [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:48 compute-1 nova_compute[183403]: 2026-01-26 15:06:48.974 183407 DEBUG nova.compute.provider_tree [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:06:49 compute-1 openstack_network_exporter[195610]: ERROR   15:06:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:06:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:06:49 compute-1 openstack_network_exporter[195610]: ERROR   15:06:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:06:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:06:49 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:06:49.469 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:06:49 compute-1 nova_compute[183403]: 2026-01-26 15:06:49.498 183407 ERROR nova.scheduler.client.report [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] [req-ac3bc266-166c-4fac-926a-4f26fc77bffa] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID e3eb07a3-6ab4-4f51-ad76-347430ed2b67.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-ac3bc266-166c-4fac-926a-4f26fc77bffa"}]}
Jan 26 15:06:49 compute-1 nova_compute[183403]: 2026-01-26 15:06:49.515 183407 DEBUG nova.scheduler.client.report [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:06:49 compute-1 nova_compute[183403]: 2026-01-26 15:06:49.529 183407 DEBUG nova.scheduler.client.report [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:06:49 compute-1 nova_compute[183403]: 2026-01-26 15:06:49.529 183407 DEBUG nova.compute.provider_tree [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:06:49 compute-1 nova_compute[183403]: 2026-01-26 15:06:49.544 183407 DEBUG nova.scheduler.client.report [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:06:49 compute-1 nova_compute[183403]: 2026-01-26 15:06:49.564 183407 DEBUG nova.scheduler.client.report [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:06:49 compute-1 nova_compute[183403]: 2026-01-26 15:06:49.600 183407 DEBUG nova.compute.provider_tree [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:06:50 compute-1 nova_compute[183403]: 2026-01-26 15:06:50.156 183407 DEBUG nova.scheduler.client.report [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Updated inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Jan 26 15:06:50 compute-1 nova_compute[183403]: 2026-01-26 15:06:50.157 183407 DEBUG nova.compute.provider_tree [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 15:06:50 compute-1 nova_compute[183403]: 2026-01-26 15:06:50.157 183407 DEBUG nova.compute.provider_tree [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:06:50 compute-1 sshd-session[204806]: Invalid user huawei from 185.246.128.170 port 50384
Jan 26 15:06:50 compute-1 nova_compute[183403]: 2026-01-26 15:06:50.666 183407 DEBUG oslo_concurrency.lockutils [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.750s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:50 compute-1 nova_compute[183403]: 2026-01-26 15:06:50.740 183407 INFO nova.scheduler.client.report [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Deleted allocations for instance 99dec4e9-b3d8-43a5-ac11-01f6490a6d99
Jan 26 15:06:51 compute-1 nova_compute[183403]: 2026-01-26 15:06:51.104 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:51 compute-1 nova_compute[183403]: 2026-01-26 15:06:51.782 183407 DEBUG oslo_concurrency.lockutils [None req-9bbecb5d-5b47-4e41-b217-7c56446b05dc eabbaf52ffab409ca33b5568f1dc327f 84dcf9273f5342fea8d5f6c33adc15c6 - - default default] Lock "99dec4e9-b3d8-43a5-ac11-01f6490a6d99" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.994s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:52 compute-1 podman[204901]: 2026-01-26 15:06:52.927422216 +0000 UTC m=+0.047581280 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:06:52 compute-1 podman[204900]: 2026-01-26 15:06:52.930525465 +0000 UTC m=+0.100899544 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 15:06:53 compute-1 nova_compute[183403]: 2026-01-26 15:06:53.319 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:53 compute-1 sshd-session[204806]: Disconnecting invalid user huawei 185.246.128.170 port 50384: Change of username or service not allowed: (huawei,ssh-connection) -> (orangepi,ssh-connection) [preauth]
Jan 26 15:06:55 compute-1 nova_compute[183403]: 2026-01-26 15:06:55.189 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:56 compute-1 nova_compute[183403]: 2026-01-26 15:06:56.105 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:56 compute-1 nova_compute[183403]: 2026-01-26 15:06:56.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:57 compute-1 nova_compute[183403]: 2026-01-26 15:06:57.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.093 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.271 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.273 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.299 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.299 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5816MB free_disk=73.149169921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.300 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.300 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:06:58 compute-1 nova_compute[183403]: 2026-01-26 15:06:58.321 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:06:59 compute-1 nova_compute[183403]: 2026-01-26 15:06:59.629 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:06:59 compute-1 nova_compute[183403]: 2026-01-26 15:06:59.630 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:06:58 up  1:02,  0 user,  load average: 0.31, 0.21, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:06:59 compute-1 nova_compute[183403]: 2026-01-26 15:06:59.779 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:07:00 compute-1 nova_compute[183403]: 2026-01-26 15:07:00.526 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:07:01 compute-1 nova_compute[183403]: 2026-01-26 15:07:01.075 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:07:01 compute-1 nova_compute[183403]: 2026-01-26 15:07:01.076 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.776s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:07:01 compute-1 nova_compute[183403]: 2026-01-26 15:07:01.107 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:02 compute-1 nova_compute[183403]: 2026-01-26 15:07:02.076 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:07:02 compute-1 nova_compute[183403]: 2026-01-26 15:07:02.077 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:07:02 compute-1 nova_compute[183403]: 2026-01-26 15:07:02.077 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:07:02 compute-1 nova_compute[183403]: 2026-01-26 15:07:02.077 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:07:02 compute-1 sshd-session[204948]: Invalid user orangepi from 185.246.128.170 port 39110
Jan 26 15:07:03 compute-1 sshd-session[204948]: Disconnecting invalid user orangepi 185.246.128.170 port 39110: Change of username or service not allowed: (orangepi,ssh-connection) -> (peter,ssh-connection) [preauth]
Jan 26 15:07:03 compute-1 nova_compute[183403]: 2026-01-26 15:07:03.324 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:04 compute-1 nova_compute[183403]: 2026-01-26 15:07:04.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:07:05 compute-1 podman[192725]: time="2026-01-26T15:07:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:07:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:07:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:07:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:07:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2163 "" "Go-http-client/1.1"
Jan 26 15:07:06 compute-1 nova_compute[183403]: 2026-01-26 15:07:06.110 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:06 compute-1 nova_compute[183403]: 2026-01-26 15:07:06.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:07:08 compute-1 nova_compute[183403]: 2026-01-26 15:07:08.327 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:11 compute-1 nova_compute[183403]: 2026-01-26 15:07:11.113 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:11 compute-1 podman[204955]: 2026-01-26 15:07:11.896573035 +0000 UTC m=+0.074505455 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter)
Jan 26 15:07:11 compute-1 podman[204954]: 2026-01-26 15:07:11.904471105 +0000 UTC m=+0.086910196 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:07:13 compute-1 nova_compute[183403]: 2026-01-26 15:07:13.328 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:13 compute-1 sshd-session[204952]: Invalid user peter from 185.246.128.170 port 37015
Jan 26 15:07:14 compute-1 sshd-session[204952]: Disconnecting invalid user peter 185.246.128.170 port 37015: Change of username or service not allowed: (peter,ssh-connection) -> (odoo,ssh-connection) [preauth]
Jan 26 15:07:16 compute-1 nova_compute[183403]: 2026-01-26 15:07:16.114 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:18 compute-1 nova_compute[183403]: 2026-01-26 15:07:18.332 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:19 compute-1 openstack_network_exporter[195610]: ERROR   15:07:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:07:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:07:19 compute-1 openstack_network_exporter[195610]: ERROR   15:07:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:07:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:07:21 compute-1 nova_compute[183403]: 2026-01-26 15:07:21.116 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:21 compute-1 sshd-session[204998]: Invalid user odoo from 185.246.128.170 port 42779
Jan 26 15:07:21 compute-1 sshd-session[204998]: Disconnecting invalid user odoo 185.246.128.170 port 42779: Change of username or service not allowed: (odoo,ssh-connection) -> (richard,ssh-connection) [preauth]
Jan 26 15:07:23 compute-1 nova_compute[183403]: 2026-01-26 15:07:23.333 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:23 compute-1 podman[205002]: 2026-01-26 15:07:23.914436346 +0000 UTC m=+0.084256878 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 15:07:23 compute-1 podman[205001]: 2026-01-26 15:07:23.941561526 +0000 UTC m=+0.118432144 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 15:07:26 compute-1 nova_compute[183403]: 2026-01-26 15:07:26.118 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:28 compute-1 nova_compute[183403]: 2026-01-26 15:07:28.336 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:29.025 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:07:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:29.026 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:07:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:29.026 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:07:31 compute-1 nova_compute[183403]: 2026-01-26 15:07:31.120 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:33 compute-1 nova_compute[183403]: 2026-01-26 15:07:33.334 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:33 compute-1 nova_compute[183403]: 2026-01-26 15:07:33.337 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:34 compute-1 sshd-session[205000]: Invalid user richard from 185.246.128.170 port 57397
Jan 26 15:07:35 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:35.318 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:07:35 compute-1 nova_compute[183403]: 2026-01-26 15:07:35.319 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:35 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:35.319 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:07:35 compute-1 podman[192725]: time="2026-01-26T15:07:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:07:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:07:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:07:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:07:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2162 "" "Go-http-client/1.1"
Jan 26 15:07:36 compute-1 nova_compute[183403]: 2026-01-26 15:07:36.122 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:38 compute-1 nova_compute[183403]: 2026-01-26 15:07:38.339 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:38 compute-1 sshd-session[205000]: Disconnecting invalid user richard 185.246.128.170 port 57397: Change of username or service not allowed: (richard,ssh-connection) -> (alex,ssh-connection) [preauth]
Jan 26 15:07:41 compute-1 nova_compute[183403]: 2026-01-26 15:07:41.124 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:42 compute-1 podman[205051]: 2026-01-26 15:07:42.872218813 +0000 UTC m=+0.052310210 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:07:42 compute-1 podman[205052]: 2026-01-26 15:07:42.886044104 +0000 UTC m=+0.060402929 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Jan 26 15:07:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:43.320 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:07:43 compute-1 nova_compute[183403]: 2026-01-26 15:07:43.341 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:46 compute-1 nova_compute[183403]: 2026-01-26 15:07:46.126 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:46 compute-1 sshd-session[205096]: Invalid user alex from 185.246.128.170 port 43696
Jan 26 15:07:46 compute-1 sshd-session[205096]: Disconnecting invalid user alex 185.246.128.170 port 43696: Change of username or service not allowed: (alex,ssh-connection) -> (administrator,ssh-connection) [preauth]
Jan 26 15:07:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:46.741 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:55:45 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dc5e9070a084dfcb543a08e87868f39', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3415b7f1-5b64-48d1-b20f-4c68422efc0e) old=Port_Binding(mac=['fa:16:3e:53:55:45'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7dc5e9070a084dfcb543a08e87868f39', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:07:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:46.742 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3415b7f1-5b64-48d1-b20f-4c68422efc0e in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf updated
Jan 26 15:07:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:46.743 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4a37c9f-5b64-4f94-80e9-126c911b1acf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:07:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:46.745 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[150cbc7c-1b1d-465f-b808-d0980ba8ab0b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:07:48 compute-1 nova_compute[183403]: 2026-01-26 15:07:48.342 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:49 compute-1 openstack_network_exporter[195610]: ERROR   15:07:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:07:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:07:49 compute-1 openstack_network_exporter[195610]: ERROR   15:07:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:07:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:07:51 compute-1 nova_compute[183403]: 2026-01-26 15:07:51.127 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:52 compute-1 sshd-session[205099]: Invalid user administrator from 185.246.128.170 port 37367
Jan 26 15:07:53 compute-1 nova_compute[183403]: 2026-01-26 15:07:53.345 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:53 compute-1 sshd-session[205099]: Disconnecting invalid user administrator 185.246.128.170 port 37367: Change of username or service not allowed: (administrator,ssh-connection) -> (csgo,ssh-connection) [preauth]
Jan 26 15:07:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:53.636 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:18:f8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-42de26f4-d2b4-46ee-b6c8-855a78698e6a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42de26f4-d2b4-46ee-b6c8-855a78698e6a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53799e06-a434-4c4a-ad1a-d8c3ad452e8f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=786d9e29-ab15-4fa1-8fe4-6571c2e0211c) old=Port_Binding(mac=['fa:16:3e:ec:18:f8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-42de26f4-d2b4-46ee-b6c8-855a78698e6a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42de26f4-d2b4-46ee-b6c8-855a78698e6a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:07:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:53.636 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 786d9e29-ab15-4fa1-8fe4-6571c2e0211c in datapath 42de26f4-d2b4-46ee-b6c8-855a78698e6a updated
Jan 26 15:07:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:53.637 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42de26f4-d2b4-46ee-b6c8-855a78698e6a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:07:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:07:53.637 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[88b85497-57e4-4255-be73-dba5f64331fb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:07:54 compute-1 podman[205104]: 2026-01-26 15:07:54.869311261 +0000 UTC m=+0.046929074 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Jan 26 15:07:54 compute-1 podman[205103]: 2026-01-26 15:07:54.897158242 +0000 UTC m=+0.078544056 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:07:54 compute-1 sshd-session[205101]: Invalid user csgo from 185.246.128.170 port 24206
Jan 26 15:07:55 compute-1 sshd-session[205101]: Disconnecting invalid user csgo 185.246.128.170 port 24206: Change of username or service not allowed: (csgo,ssh-connection) -> (ftp1,ssh-connection) [preauth]
Jan 26 15:07:56 compute-1 nova_compute[183403]: 2026-01-26 15:07:56.129 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:56 compute-1 nova_compute[183403]: 2026-01-26 15:07:56.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:07:58 compute-1 nova_compute[183403]: 2026-01-26 15:07:58.347 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:07:58 compute-1 nova_compute[183403]: 2026-01-26 15:07:58.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:07:58 compute-1 nova_compute[183403]: 2026-01-26 15:07:58.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:07:58 compute-1 nova_compute[183403]: 2026-01-26 15:07:58.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.093 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.218 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.219 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.236 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.236 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5879MB free_disk=73.1491470336914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.237 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:07:59 compute-1 nova_compute[183403]: 2026-01-26 15:07:59.237 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:07:59 compute-1 sshd-session[205150]: Invalid user ftp1 from 185.246.128.170 port 33088
Jan 26 15:08:00 compute-1 nova_compute[183403]: 2026-01-26 15:08:00.412 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:08:00 compute-1 nova_compute[183403]: 2026-01-26 15:08:00.413 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:07:59 up  1:03,  0 user,  load average: 0.11, 0.17, 0.39\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:08:00 compute-1 nova_compute[183403]: 2026-01-26 15:08:00.439 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:08:00 compute-1 sshd-session[205150]: Disconnecting invalid user ftp1 185.246.128.170 port 33088: Change of username or service not allowed: (ftp1,ssh-connection) -> (publicuser,ssh-connection) [preauth]
Jan 26 15:08:00 compute-1 nova_compute[183403]: 2026-01-26 15:08:00.947 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:08:01 compute-1 nova_compute[183403]: 2026-01-26 15:08:01.131 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:01 compute-1 nova_compute[183403]: 2026-01-26 15:08:01.463 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:08:01 compute-1 nova_compute[183403]: 2026-01-26 15:08:01.464 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.226s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:03 compute-1 nova_compute[183403]: 2026-01-26 15:08:03.348 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:03 compute-1 nova_compute[183403]: 2026-01-26 15:08:03.459 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:08:03 compute-1 nova_compute[183403]: 2026-01-26 15:08:03.977 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:08:03 compute-1 nova_compute[183403]: 2026-01-26 15:08:03.978 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:08:03 compute-1 nova_compute[183403]: 2026-01-26 15:08:03.978 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:08:05 compute-1 nova_compute[183403]: 2026-01-26 15:08:05.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:08:05 compute-1 podman[192725]: time="2026-01-26T15:08:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:08:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:08:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:08:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:08:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2162 "" "Go-http-client/1.1"
Jan 26 15:08:06 compute-1 nova_compute[183403]: 2026-01-26 15:08:06.133 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:07 compute-1 nova_compute[183403]: 2026-01-26 15:08:07.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:08:07 compute-1 ovn_controller[95641]: 2026-01-26T15:08:07Z|00048|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 15:08:08 compute-1 sshd-session[205153]: Invalid user publicuser from 185.246.128.170 port 40324
Jan 26 15:08:08 compute-1 nova_compute[183403]: 2026-01-26 15:08:08.350 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:09 compute-1 sshd-session[205153]: Disconnecting invalid user publicuser 185.246.128.170 port 40324: Change of username or service not allowed: (publicuser,ssh-connection) -> (ddd,ssh-connection) [preauth]
Jan 26 15:08:11 compute-1 nova_compute[183403]: 2026-01-26 15:08:11.135 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:13 compute-1 nova_compute[183403]: 2026-01-26 15:08:13.352 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:13 compute-1 podman[205156]: 2026-01-26 15:08:13.872182001 +0000 UTC m=+0.054553042 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest 
release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 26 15:08:13 compute-1 podman[205155]: 2026-01-26 15:08:13.888050715 +0000 UTC m=+0.074067023 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:08:16 compute-1 nova_compute[183403]: 2026-01-26 15:08:16.136 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:18 compute-1 nova_compute[183403]: 2026-01-26 15:08:18.354 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:19 compute-1 openstack_network_exporter[195610]: ERROR   15:08:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:08:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:08:19 compute-1 openstack_network_exporter[195610]: ERROR   15:08:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:08:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:08:19 compute-1 sshd-session[205200]: Invalid user ddd from 185.246.128.170 port 63403
Jan 26 15:08:20 compute-1 sshd-session[205200]: Disconnecting invalid user ddd 185.246.128.170 port 63403: Change of username or service not allowed: (ddd,ssh-connection) -> (manager,ssh-connection) [preauth]
Jan 26 15:08:21 compute-1 nova_compute[183403]: 2026-01-26 15:08:21.137 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:23 compute-1 nova_compute[183403]: 2026-01-26 15:08:23.355 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:25 compute-1 podman[205205]: 2026-01-26 15:08:25.888440832 +0000 UTC m=+0.064033664 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Jan 26 15:08:25 compute-1 podman[205204]: 2026-01-26 15:08:25.932682401 +0000 UTC m=+0.110988497 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:08:26 compute-1 nova_compute[183403]: 2026-01-26 15:08:26.138 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:28 compute-1 nova_compute[183403]: 2026-01-26 15:08:28.358 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:29.027 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:29.027 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:29.027 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:31 compute-1 nova_compute[183403]: 2026-01-26 15:08:31.140 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:33 compute-1 nova_compute[183403]: 2026-01-26 15:08:33.360 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:35 compute-1 sshd-session[205202]: Invalid user manager from 185.246.128.170 port 18164
Jan 26 15:08:35 compute-1 podman[192725]: time="2026-01-26T15:08:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:08:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:08:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:08:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:08:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 26 15:08:35 compute-1 sshd-session[205202]: Disconnecting invalid user manager 185.246.128.170 port 18164: Change of username or service not allowed: (manager,ssh-connection) -> (scsadmin,ssh-connection) [preauth]
Jan 26 15:08:35 compute-1 nova_compute[183403]: 2026-01-26 15:08:35.990 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "66a7af21-1abe-467f-b739-441e05a4b09a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:35 compute-1 nova_compute[183403]: 2026-01-26 15:08:35.990 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:36 compute-1 nova_compute[183403]: 2026-01-26 15:08:36.142 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:36 compute-1 nova_compute[183403]: 2026-01-26 15:08:36.496 183407 DEBUG nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:08:37 compute-1 nova_compute[183403]: 2026-01-26 15:08:37.045 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:37 compute-1 nova_compute[183403]: 2026-01-26 15:08:37.046 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:37 compute-1 nova_compute[183403]: 2026-01-26 15:08:37.053 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:08:37 compute-1 nova_compute[183403]: 2026-01-26 15:08:37.053 183407 INFO nova.compute.claims [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:08:38 compute-1 sshd-session[205252]: Invalid user scsadmin from 185.246.128.170 port 47471
Jan 26 15:08:38 compute-1 nova_compute[183403]: 2026-01-26 15:08:38.100 183407 DEBUG nova.compute.provider_tree [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:08:38 compute-1 nova_compute[183403]: 2026-01-26 15:08:38.363 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:38 compute-1 nova_compute[183403]: 2026-01-26 15:08:38.606 183407 DEBUG nova.scheduler.client.report [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:08:39 compute-1 sshd-session[205252]: Disconnecting invalid user scsadmin 185.246.128.170 port 47471: Change of username or service not allowed: (scsadmin,ssh-connection) -> (sync,ssh-connection) [preauth]
Jan 26 15:08:39 compute-1 nova_compute[183403]: 2026-01-26 15:08:39.119 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.073s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:39 compute-1 nova_compute[183403]: 2026-01-26 15:08:39.120 183407 DEBUG nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:08:39 compute-1 nova_compute[183403]: 2026-01-26 15:08:39.633 183407 DEBUG nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:08:39 compute-1 nova_compute[183403]: 2026-01-26 15:08:39.634 183407 DEBUG nova.network.neutron [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:08:39 compute-1 nova_compute[183403]: 2026-01-26 15:08:39.635 183407 WARNING neutronclient.v2_0.client [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:08:39 compute-1 nova_compute[183403]: 2026-01-26 15:08:39.636 183407 WARNING neutronclient.v2_0.client [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:08:40 compute-1 nova_compute[183403]: 2026-01-26 15:08:40.145 183407 INFO nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:08:40 compute-1 nova_compute[183403]: 2026-01-26 15:08:40.654 183407 DEBUG nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.090 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:41.091 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:08:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:41.092 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.145 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.269 183407 DEBUG nova.network.neutron [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Successfully created port: 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.672 183407 DEBUG nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.674 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.675 183407 INFO nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Creating image(s)
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.676 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "/var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.676 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "/var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.678 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "/var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.681 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.688 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.690 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.781 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.782 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.782 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.783 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.785 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.786 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.859 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.860 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.895 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.896 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.896 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.945 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.946 183407 DEBUG nova.virt.disk.api [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Checking if we can resize image /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:08:41 compute-1 nova_compute[183403]: 2026-01-26 15:08:41.946 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.014 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.015 183407 DEBUG nova.virt.disk.api [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Cannot resize image /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.015 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.015 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Ensure instance console log exists: /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.016 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.016 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.016 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.132 183407 DEBUG nova.network.neutron [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Successfully updated port: 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.196 183407 DEBUG nova.compute.manager [req-df92d16b-4985-4f54-a2cf-0f63f74ecd47 req-56a4983c-eb9d-423a-9be6-a74e8aa5f316 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Received event network-changed-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.196 183407 DEBUG nova.compute.manager [req-df92d16b-4985-4f54-a2cf-0f63f74ecd47 req-56a4983c-eb9d-423a-9be6-a74e8aa5f316 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Refreshing instance network info cache due to event network-changed-1fd7a551-45a6-412c-abb2-e2d57c2b25e8. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.197 183407 DEBUG oslo_concurrency.lockutils [req-df92d16b-4985-4f54-a2cf-0f63f74ecd47 req-56a4983c-eb9d-423a-9be6-a74e8aa5f316 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-66a7af21-1abe-467f-b739-441e05a4b09a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.198 183407 DEBUG oslo_concurrency.lockutils [req-df92d16b-4985-4f54-a2cf-0f63f74ecd47 req-56a4983c-eb9d-423a-9be6-a74e8aa5f316 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-66a7af21-1abe-467f-b739-441e05a4b09a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.198 183407 DEBUG nova.network.neutron [req-df92d16b-4985-4f54-a2cf-0f63f74ecd47 req-56a4983c-eb9d-423a-9be6-a74e8aa5f316 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Refreshing network info cache for port 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.642 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "refresh_cache-66a7af21-1abe-467f-b739-441e05a4b09a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:08:42 compute-1 nova_compute[183403]: 2026-01-26 15:08:42.850 183407 WARNING neutronclient.v2_0.client [req-df92d16b-4985-4f54-a2cf-0f63f74ecd47 req-56a4983c-eb9d-423a-9be6-a74e8aa5f316 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:08:43 compute-1 nova_compute[183403]: 2026-01-26 15:08:43.028 183407 DEBUG nova.network.neutron [req-df92d16b-4985-4f54-a2cf-0f63f74ecd47 req-56a4983c-eb9d-423a-9be6-a74e8aa5f316 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:08:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:43.093 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:08:43 compute-1 nova_compute[183403]: 2026-01-26 15:08:43.181 183407 DEBUG nova.network.neutron [req-df92d16b-4985-4f54-a2cf-0f63f74ecd47 req-56a4983c-eb9d-423a-9be6-a74e8aa5f316 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:08:43 compute-1 nova_compute[183403]: 2026-01-26 15:08:43.365 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:43 compute-1 nova_compute[183403]: 2026-01-26 15:08:43.688 183407 DEBUG oslo_concurrency.lockutils [req-df92d16b-4985-4f54-a2cf-0f63f74ecd47 req-56a4983c-eb9d-423a-9be6-a74e8aa5f316 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-66a7af21-1abe-467f-b739-441e05a4b09a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:08:43 compute-1 nova_compute[183403]: 2026-01-26 15:08:43.689 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquired lock "refresh_cache-66a7af21-1abe-467f-b739-441e05a4b09a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:08:43 compute-1 nova_compute[183403]: 2026-01-26 15:08:43.690 183407 DEBUG nova.network.neutron [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:08:44 compute-1 podman[205273]: 2026-01-26 15:08:44.899449597 +0000 UTC m=+0.067326546 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 15:08:44 compute-1 podman[205272]: 2026-01-26 15:08:44.914239389 +0000 UTC m=+0.081355980 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:08:45 compute-1 nova_compute[183403]: 2026-01-26 15:08:45.061 183407 DEBUG nova.network.neutron [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:08:46 compute-1 nova_compute[183403]: 2026-01-26 15:08:46.146 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:46 compute-1 nova_compute[183403]: 2026-01-26 15:08:46.177 183407 WARNING neutronclient.v2_0.client [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:08:46 compute-1 nova_compute[183403]: 2026-01-26 15:08:46.672 183407 DEBUG nova.network.neutron [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Updating instance_info_cache with network_info: [{"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.178 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Releasing lock "refresh_cache-66a7af21-1abe-467f-b739-441e05a4b09a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.179 183407 DEBUG nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Instance network_info: |[{"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.180 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Start _get_guest_xml network_info=[{"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.184 183407 WARNING nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.185 183407 DEBUG nova.virt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-804615053', uuid='66a7af21-1abe-467f-b739-441e05a4b09a'), owner=OwnerMeta(userid='afb4f4811cb043dca89a8413c390ba3d', username='tempest-TestExecuteActionsViaActuator-280856547-project-admin', projectid='6377892a338d4a7cbe63cf30bd2c63ea', projectname='tempest-TestExecuteActionsViaActuator-280856547'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": 
"1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769440127.1855636) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.194 183407 DEBUG nova.virt.libvirt.host [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.194 183407 DEBUG nova.virt.libvirt.host [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.198 183407 DEBUG nova.virt.libvirt.host [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.199 183407 DEBUG nova.virt.libvirt.host [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.200 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.200 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.200 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.201 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.201 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.201 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.201 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.202 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.202 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.202 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.202 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.203 183407 DEBUG nova.virt.hardware [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.206 183407 DEBUG nova.virt.libvirt.vif [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:08:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-804615053',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-804615053',id=5,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-hxr43d02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsVi
aActuator-280856547-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:08:40Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=66a7af21-1abe-467f-b739-441e05a4b09a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.206 183407 DEBUG nova.network.os_vif_util [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.207 183407 DEBUG nova.network.os_vif_util [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:59:eb,bridge_name='br-int',has_traffic_filtering=True,id=1fd7a551-45a6-412c-abb2-e2d57c2b25e8,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fd7a551-45') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.207 183407 DEBUG nova.objects.instance [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lazy-loading 'pci_devices' on Instance uuid 66a7af21-1abe-467f-b739-441e05a4b09a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.716 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <uuid>66a7af21-1abe-467f-b739-441e05a4b09a</uuid>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <name>instance-00000005</name>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-804615053</nova:name>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:08:47</nova:creationTime>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:08:47 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:08:47 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:user uuid="afb4f4811cb043dca89a8413c390ba3d">tempest-TestExecuteActionsViaActuator-280856547-project-admin</nova:user>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:project uuid="6377892a338d4a7cbe63cf30bd2c63ea">tempest-TestExecuteActionsViaActuator-280856547</nova:project>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         <nova:port uuid="1fd7a551-45a6-412c-abb2-e2d57c2b25e8">
Jan 26 15:08:47 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <system>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <entry name="serial">66a7af21-1abe-467f-b739-441e05a4b09a</entry>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <entry name="uuid">66a7af21-1abe-467f-b739-441e05a4b09a</entry>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     </system>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <os>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   </os>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <features>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   </features>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk.config"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:da:59:eb"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <target dev="tap1fd7a551-45"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/console.log" append="off"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <video>
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     </video>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:08:47 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:08:47 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:08:47 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:08:47 compute-1 nova_compute[183403]: </domain>
Jan 26 15:08:47 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.717 183407 DEBUG nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Preparing to wait for external event network-vif-plugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.718 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.718 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.718 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.719 183407 DEBUG nova.virt.libvirt.vif [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:08:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-804615053',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-804615053',id=5,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-hxr43d02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecut
eActionsViaActuator-280856547-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:08:40Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=66a7af21-1abe-467f-b739-441e05a4b09a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.719 183407 DEBUG nova.network.os_vif_util [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.720 183407 DEBUG nova.network.os_vif_util [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:59:eb,bridge_name='br-int',has_traffic_filtering=True,id=1fd7a551-45a6-412c-abb2-e2d57c2b25e8,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fd7a551-45') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.720 183407 DEBUG os_vif [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:59:eb,bridge_name='br-int',has_traffic_filtering=True,id=1fd7a551-45a6-412c-abb2-e2d57c2b25e8,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fd7a551-45') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.721 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.721 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.721 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.722 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.722 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '640a76a8-5875-5155-9295-5ec1bbe6c3ed', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.723 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.725 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.728 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.729 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1fd7a551-45, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.730 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1fd7a551-45, col_values=(('qos', UUID('715b2da4-57f5-4aa8-962d-81f147f979ad')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.730 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1fd7a551-45, col_values=(('external_ids', {'iface-id': '1fd7a551-45a6-412c-abb2-e2d57c2b25e8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:59:eb', 'vm-uuid': '66a7af21-1abe-467f-b739-441e05a4b09a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.732 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:47 compute-1 NetworkManager[55716]: <info>  [1769440127.7330] manager: (tap1fd7a551-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.736 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.741 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:47 compute-1 nova_compute[183403]: 2026-01-26 15:08:47.742 183407 INFO os_vif [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:59:eb,bridge_name='br-int',has_traffic_filtering=True,id=1fd7a551-45a6-412c-abb2-e2d57c2b25e8,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fd7a551-45')
Jan 26 15:08:48 compute-1 nova_compute[183403]: 2026-01-26 15:08:48.366 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:49 compute-1 nova_compute[183403]: 2026-01-26 15:08:49.291 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:08:49 compute-1 nova_compute[183403]: 2026-01-26 15:08:49.292 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:08:49 compute-1 nova_compute[183403]: 2026-01-26 15:08:49.292 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] No VIF found with MAC fa:16:3e:da:59:eb, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:08:49 compute-1 nova_compute[183403]: 2026-01-26 15:08:49.292 183407 INFO nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Using config drive
Jan 26 15:08:49 compute-1 openstack_network_exporter[195610]: ERROR   15:08:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:08:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:08:49 compute-1 openstack_network_exporter[195610]: ERROR   15:08:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:08:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:08:49 compute-1 nova_compute[183403]: 2026-01-26 15:08:49.804 183407 WARNING neutronclient.v2_0.client [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:08:50 compute-1 sshd-session[205254]: Disconnecting authenticating user sync 185.246.128.170 port 27565: Change of username or service not allowed: (sync,ssh-connection) -> (qemu,ssh-connection) [preauth]
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.158 183407 INFO nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Creating config drive at /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk.config
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.164 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp5sv6eus1 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.294 183407 DEBUG oslo_concurrency.processutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp5sv6eus1" returned: 0 in 0.130s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:08:50 compute-1 kernel: tap1fd7a551-45: entered promiscuous mode
Jan 26 15:08:50 compute-1 ovn_controller[95641]: 2026-01-26T15:08:50Z|00049|binding|INFO|Claiming lport 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 for this chassis.
Jan 26 15:08:50 compute-1 ovn_controller[95641]: 2026-01-26T15:08:50Z|00050|binding|INFO|1fd7a551-45a6-412c-abb2-e2d57c2b25e8: Claiming fa:16:3e:da:59:eb 10.100.0.9
Jan 26 15:08:50 compute-1 NetworkManager[55716]: <info>  [1769440130.3723] manager: (tap1fd7a551-45): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.371 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.375 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.380 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.387 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:59:eb 10.100.0.9'], port_security=['fa:16:3e:da:59:eb 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '66a7af21-1abe-467f-b739-441e05a4b09a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=1fd7a551-45a6-412c-abb2-e2d57c2b25e8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.388 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf bound to our chassis
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.389 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.403 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ef9535-d4dc-42c1-b3b2-80a50d957b2c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.403 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd4a37c9f-51 in ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.411 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd4a37c9f-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.411 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[70ac3a35-4450-49bd-a27d-521992911bcd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.412 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f8643ab3-6ded-4206-a32a-cec7be0edd13]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 systemd-machined[154697]: New machine qemu-2-instance-00000005.
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.423 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[dac8e2d7-482f-49b2-be71-2af23c4cf3ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.450 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[728f2ca4-8017-4d37-afd9-92a11df215d3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.453 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:50 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Jan 26 15:08:50 compute-1 ovn_controller[95641]: 2026-01-26T15:08:50Z|00051|binding|INFO|Setting lport 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 ovn-installed in OVS
Jan 26 15:08:50 compute-1 ovn_controller[95641]: 2026-01-26T15:08:50Z|00052|binding|INFO|Setting lport 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 up in Southbound
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.459 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:50 compute-1 systemd-udevd[205341]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.480 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[2675ea71-f634-4761-bd80-955a80e675fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 NetworkManager[55716]: <info>  [1769440130.4849] manager: (tapd4a37c9f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.483 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2764b3-8856-4267-a481-c2baf298ab4e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 systemd-udevd[205343]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:08:50 compute-1 NetworkManager[55716]: <info>  [1769440130.4921] device (tap1fd7a551-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:08:50 compute-1 NetworkManager[55716]: <info>  [1769440130.4927] device (tap1fd7a551-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.521 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[46c80d56-7a52-4f12-bed5-0b7daa4364ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.524 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5deec6-ed9c-4d12-af14-eeb48834a5eb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 NetworkManager[55716]: <info>  [1769440130.5438] device (tapd4a37c9f-50): carrier: link connected
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.550 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[3847eefb-62f9-4ae1-95ca-ac156c48e710]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.565 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d8c94c65-25c6-4ab6-a758-2de9d0d287a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205369, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.578 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[972f5431-0cd6-408f-8bd8-a70b1a72d9fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe53:5545'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385314, 'tstamp': 385314}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 205370, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.596 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f43a575a-c895-498a-ab79-1f54c4c68c63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 205371, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.620 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4caedbb9-3d0b-418a-98db-45bcb9967f07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.673 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[14b943ca-bfc1-4370-8507-baeafa0c8b60]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.674 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.674 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.674 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.676 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:50 compute-1 NetworkManager[55716]: <info>  [1769440130.6764] manager: (tapd4a37c9f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 26 15:08:50 compute-1 kernel: tapd4a37c9f-50: entered promiscuous mode
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.679 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.680 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.680 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:50 compute-1 ovn_controller[95641]: 2026-01-26T15:08:50Z|00053|binding|INFO|Releasing lport 3415b7f1-5b64-48d1-b20f-4c68422efc0e from this chassis (sb_readonly=0)
Jan 26 15:08:50 compute-1 nova_compute[183403]: 2026-01-26 15:08:50.693 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.694 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[01811ee0-a1dc-4665-9e49-43f40f2bde18]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.695 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.695 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.695 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d4a37c9f-5b64-4f94-80e9-126c911b1acf disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.695 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.696 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8603a46b-e6b2-4ad8-9cbd-c28c13106b68]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.696 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.697 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[92bcabff-76b2-4acf-b8a4-8e6b11acc514]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.697 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:08:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:08:50.697 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'env', 'PROCESS_TAG=haproxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d4a37c9f-5b64-4f94-80e9-126c911b1acf.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:08:51 compute-1 podman[205410]: 2026-01-26 15:08:51.103677713 +0000 UTC m=+0.063077196 container create feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Jan 26 15:08:51 compute-1 systemd[1]: Started libpod-conmon-feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5.scope.
Jan 26 15:08:51 compute-1 podman[205410]: 2026-01-26 15:08:51.071670804 +0000 UTC m=+0.031070307 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:08:51 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:08:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/586c742fcf2df3da6069b7da1e64cb5522dd4fc218e9b835e719d2c137bf039e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:08:51 compute-1 podman[205410]: 2026-01-26 15:08:51.197675028 +0000 UTC m=+0.157074541 container init feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 15:08:51 compute-1 podman[205410]: 2026-01-26 15:08:51.205471783 +0000 UTC m=+0.164871266 container start feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 15:08:51 compute-1 neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf[205425]: [NOTICE]   (205429) : New worker (205431) forked
Jan 26 15:08:51 compute-1 neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf[205425]: [NOTICE]   (205429) : Loading success.
Jan 26 15:08:51 compute-1 nova_compute[183403]: 2026-01-26 15:08:51.793 183407 DEBUG nova.compute.manager [req-02a364be-8ba0-4848-85b4-bbcf16cd1523 req-6e787410-a9ce-4294-b57a-3f0d1fd1ad66 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Received event network-vif-plugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:08:51 compute-1 nova_compute[183403]: 2026-01-26 15:08:51.794 183407 DEBUG oslo_concurrency.lockutils [req-02a364be-8ba0-4848-85b4-bbcf16cd1523 req-6e787410-a9ce-4294-b57a-3f0d1fd1ad66 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:51 compute-1 nova_compute[183403]: 2026-01-26 15:08:51.794 183407 DEBUG oslo_concurrency.lockutils [req-02a364be-8ba0-4848-85b4-bbcf16cd1523 req-6e787410-a9ce-4294-b57a-3f0d1fd1ad66 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:51 compute-1 nova_compute[183403]: 2026-01-26 15:08:51.794 183407 DEBUG oslo_concurrency.lockutils [req-02a364be-8ba0-4848-85b4-bbcf16cd1523 req-6e787410-a9ce-4294-b57a-3f0d1fd1ad66 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:51 compute-1 nova_compute[183403]: 2026-01-26 15:08:51.794 183407 DEBUG nova.compute.manager [req-02a364be-8ba0-4848-85b4-bbcf16cd1523 req-6e787410-a9ce-4294-b57a-3f0d1fd1ad66 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Processing event network-vif-plugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:08:51 compute-1 nova_compute[183403]: 2026-01-26 15:08:51.795 183407 DEBUG nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:08:51 compute-1 nova_compute[183403]: 2026-01-26 15:08:51.802 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:08:51 compute-1 nova_compute[183403]: 2026-01-26 15:08:51.807 183407 INFO nova.virt.libvirt.driver [-] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Instance spawned successfully.
Jan 26 15:08:51 compute-1 nova_compute[183403]: 2026-01-26 15:08:51.808 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:08:52 compute-1 nova_compute[183403]: 2026-01-26 15:08:52.326 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:08:52 compute-1 nova_compute[183403]: 2026-01-26 15:08:52.327 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:08:52 compute-1 nova_compute[183403]: 2026-01-26 15:08:52.328 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:08:52 compute-1 nova_compute[183403]: 2026-01-26 15:08:52.328 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:08:52 compute-1 nova_compute[183403]: 2026-01-26 15:08:52.329 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:08:52 compute-1 nova_compute[183403]: 2026-01-26 15:08:52.329 183407 DEBUG nova.virt.libvirt.driver [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:08:52 compute-1 nova_compute[183403]: 2026-01-26 15:08:52.733 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:52 compute-1 nova_compute[183403]: 2026-01-26 15:08:52.839 183407 INFO nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Took 11.17 seconds to spawn the instance on the hypervisor.
Jan 26 15:08:52 compute-1 nova_compute[183403]: 2026-01-26 15:08:52.839 183407 DEBUG nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:08:53 compute-1 nova_compute[183403]: 2026-01-26 15:08:53.368 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:53 compute-1 nova_compute[183403]: 2026-01-26 15:08:53.388 183407 INFO nova.compute.manager [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Took 16.38 seconds to build instance.
Jan 26 15:08:53 compute-1 nova_compute[183403]: 2026-01-26 15:08:53.839 183407 DEBUG nova.compute.manager [req-1a263d84-869e-4367-a392-c762cdd96fe8 req-3355688e-7fc6-41d6-b15b-1aaf33675d2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Received event network-vif-plugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:08:53 compute-1 nova_compute[183403]: 2026-01-26 15:08:53.840 183407 DEBUG oslo_concurrency.lockutils [req-1a263d84-869e-4367-a392-c762cdd96fe8 req-3355688e-7fc6-41d6-b15b-1aaf33675d2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:53 compute-1 nova_compute[183403]: 2026-01-26 15:08:53.841 183407 DEBUG oslo_concurrency.lockutils [req-1a263d84-869e-4367-a392-c762cdd96fe8 req-3355688e-7fc6-41d6-b15b-1aaf33675d2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:53 compute-1 nova_compute[183403]: 2026-01-26 15:08:53.841 183407 DEBUG oslo_concurrency.lockutils [req-1a263d84-869e-4367-a392-c762cdd96fe8 req-3355688e-7fc6-41d6-b15b-1aaf33675d2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:53 compute-1 nova_compute[183403]: 2026-01-26 15:08:53.841 183407 DEBUG nova.compute.manager [req-1a263d84-869e-4367-a392-c762cdd96fe8 req-3355688e-7fc6-41d6-b15b-1aaf33675d2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] No waiting events found dispatching network-vif-plugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:08:53 compute-1 nova_compute[183403]: 2026-01-26 15:08:53.842 183407 WARNING nova.compute.manager [req-1a263d84-869e-4367-a392-c762cdd96fe8 req-3355688e-7fc6-41d6-b15b-1aaf33675d2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Received unexpected event network-vif-plugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 for instance with vm_state active and task_state None.
Jan 26 15:08:53 compute-1 nova_compute[183403]: 2026-01-26 15:08:53.895 183407 DEBUG oslo_concurrency.lockutils [None req-87ee8093-7ce3-41ee-8711-2498af0cd662 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.905s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:56 compute-1 podman[205442]: 2026-01-26 15:08:56.934867551 +0000 UTC m=+0.089456397 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:08:56 compute-1 podman[205441]: 2026-01-26 15:08:56.97510978 +0000 UTC m=+0.133881483 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:08:57 compute-1 nova_compute[183403]: 2026-01-26 15:08:57.737 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:58 compute-1 nova_compute[183403]: 2026-01-26 15:08:58.372 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:08:58 compute-1 nova_compute[183403]: 2026-01-26 15:08:58.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:08:58 compute-1 nova_compute[183403]: 2026-01-26 15:08:58.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:08:58 compute-1 nova_compute[183403]: 2026-01-26 15:08:58.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:08:59 compute-1 nova_compute[183403]: 2026-01-26 15:08:59.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:08:59 compute-1 nova_compute[183403]: 2026-01-26 15:08:59.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:08:59 compute-1 nova_compute[183403]: 2026-01-26 15:08:59.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:08:59 compute-1 nova_compute[183403]: 2026-01-26 15:08:59.092 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.138 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.199 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.200 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.315 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json" returned: 0 in 0.116s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.446 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.448 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.478 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.479 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5694MB free_disk=73.14832305908203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.479 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:00 compute-1 nova_compute[183403]: 2026-01-26 15:09:00.479 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:01 compute-1 nova_compute[183403]: 2026-01-26 15:09:01.540 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 66a7af21-1abe-467f-b739-441e05a4b09a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:09:01 compute-1 nova_compute[183403]: 2026-01-26 15:09:01.541 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:09:01 compute-1 nova_compute[183403]: 2026-01-26 15:09:01.541 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:09:00 up  1:04,  0 user,  load average: 0.19, 0.17, 0.37\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_6377892a338d4a7cbe63cf30bd2c63ea': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:09:01 compute-1 nova_compute[183403]: 2026-01-26 15:09:01.588 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:09:02 compute-1 nova_compute[183403]: 2026-01-26 15:09:02.094 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:09:02 compute-1 nova_compute[183403]: 2026-01-26 15:09:02.603 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:09:02 compute-1 nova_compute[183403]: 2026-01-26 15:09:02.604 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:09:02 compute-1 nova_compute[183403]: 2026-01-26 15:09:02.740 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:03 compute-1 nova_compute[183403]: 2026-01-26 15:09:03.374 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:04 compute-1 ovn_controller[95641]: 2026-01-26T15:09:04Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:da:59:eb 10.100.0.9
Jan 26 15:09:04 compute-1 ovn_controller[95641]: 2026-01-26T15:09:04Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:da:59:eb 10.100.0.9
Jan 26 15:09:04 compute-1 nova_compute[183403]: 2026-01-26 15:09:04.604 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:09:04 compute-1 nova_compute[183403]: 2026-01-26 15:09:04.605 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:09:04 compute-1 nova_compute[183403]: 2026-01-26 15:09:04.605 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:09:04 compute-1 nova_compute[183403]: 2026-01-26 15:09:04.605 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:09:05 compute-1 nova_compute[183403]: 2026-01-26 15:09:05.528 183407 DEBUG nova.compute.manager [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Jan 26 15:09:05 compute-1 nova_compute[183403]: 2026-01-26 15:09:05.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:09:05 compute-1 podman[192725]: time="2026-01-26T15:09:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:09:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:09:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:09:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:09:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Jan 26 15:09:06 compute-1 nova_compute[183403]: 2026-01-26 15:09:06.060 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:06 compute-1 nova_compute[183403]: 2026-01-26 15:09:06.061 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:06 compute-1 nova_compute[183403]: 2026-01-26 15:09:06.576 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:09:06 compute-1 nova_compute[183403]: 2026-01-26 15:09:06.576 183407 INFO nova.compute.claims [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:09:07 compute-1 nova_compute[183403]: 2026-01-26 15:09:07.091 183407 INFO nova.compute.resource_tracker [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Updating resource usage from migration e5ee547a-77b7-44f0-9d5e-1f2d4a0e8050
Jan 26 15:09:07 compute-1 nova_compute[183403]: 2026-01-26 15:09:07.091 183407 DEBUG nova.compute.resource_tracker [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Starting to track incoming migration e5ee547a-77b7-44f0-9d5e-1f2d4a0e8050 with flavor b6884b57-cf1a-443f-a7b9-2aea263b07fa _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Jan 26 15:09:07 compute-1 nova_compute[183403]: 2026-01-26 15:09:07.677 183407 DEBUG nova.compute.provider_tree [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:09:07 compute-1 nova_compute[183403]: 2026-01-26 15:09:07.743 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:08 compute-1 nova_compute[183403]: 2026-01-26 15:09:08.185 183407 DEBUG nova.scheduler.client.report [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:09:08 compute-1 nova_compute[183403]: 2026-01-26 15:09:08.377 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:08 compute-1 nova_compute[183403]: 2026-01-26 15:09:08.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:09:08 compute-1 nova_compute[183403]: 2026-01-26 15:09:08.695 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 2.634s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:09:08 compute-1 nova_compute[183403]: 2026-01-26 15:09:08.696 183407 INFO nova.compute.manager [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Migrating
Jan 26 15:09:08 compute-1 nova_compute[183403]: 2026-01-26 15:09:08.696 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:09:08 compute-1 nova_compute[183403]: 2026-01-26 15:09:08.696 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:09:09 compute-1 nova_compute[183403]: 2026-01-26 15:09:09.202 183407 INFO nova.compute.rpcapi [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Jan 26 15:09:09 compute-1 nova_compute[183403]: 2026-01-26 15:09:09.204 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:09:10 compute-1 sshd-session[205440]: Disconnecting authenticating user qemu 185.246.128.170 port 56602: Change of username or service not allowed: (qemu,ssh-connection) -> (teste,ssh-connection) [preauth]
Jan 26 15:09:12 compute-1 nova_compute[183403]: 2026-01-26 15:09:12.745 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:13 compute-1 nova_compute[183403]: 2026-01-26 15:09:13.379 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:14 compute-1 sshd-session[205511]: Accepted publickey for nova from 192.168.122.100 port 41802 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:09:14 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 26 15:09:14 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 26 15:09:14 compute-1 systemd-logind[795]: New session 28 of user nova.
Jan 26 15:09:14 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 26 15:09:14 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 26 15:09:14 compute-1 systemd[205515]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:09:14 compute-1 systemd[205515]: Queued start job for default target Main User Target.
Jan 26 15:09:14 compute-1 systemd[205515]: Created slice User Application Slice.
Jan 26 15:09:14 compute-1 systemd[205515]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 15:09:14 compute-1 systemd[205515]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 15:09:14 compute-1 systemd[205515]: Reached target Paths.
Jan 26 15:09:14 compute-1 systemd[205515]: Reached target Timers.
Jan 26 15:09:14 compute-1 systemd[205515]: Starting D-Bus User Message Bus Socket...
Jan 26 15:09:14 compute-1 systemd[205515]: Starting Create User's Volatile Files and Directories...
Jan 26 15:09:14 compute-1 systemd[205515]: Finished Create User's Volatile Files and Directories.
Jan 26 15:09:14 compute-1 systemd[205515]: Listening on D-Bus User Message Bus Socket.
Jan 26 15:09:14 compute-1 systemd[205515]: Reached target Sockets.
Jan 26 15:09:14 compute-1 systemd[205515]: Reached target Basic System.
Jan 26 15:09:14 compute-1 systemd[205515]: Reached target Main User Target.
Jan 26 15:09:14 compute-1 systemd[205515]: Startup finished in 132ms.
Jan 26 15:09:14 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 26 15:09:14 compute-1 systemd[1]: Started Session 28 of User nova.
Jan 26 15:09:14 compute-1 sshd-session[205511]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:09:14 compute-1 sshd-session[205530]: Received disconnect from 192.168.122.100 port 41802:11: disconnected by user
Jan 26 15:09:14 compute-1 sshd-session[205530]: Disconnected from user nova 192.168.122.100 port 41802
Jan 26 15:09:14 compute-1 sshd-session[205511]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:09:14 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Jan 26 15:09:14 compute-1 systemd-logind[795]: Session 28 logged out. Waiting for processes to exit.
Jan 26 15:09:14 compute-1 systemd-logind[795]: Removed session 28.
Jan 26 15:09:14 compute-1 sshd-session[205532]: Accepted publickey for nova from 192.168.122.100 port 41818 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:09:14 compute-1 systemd-logind[795]: New session 30 of user nova.
Jan 26 15:09:14 compute-1 systemd[1]: Started Session 30 of User nova.
Jan 26 15:09:14 compute-1 sshd-session[205532]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:09:14 compute-1 sshd-session[205535]: Received disconnect from 192.168.122.100 port 41818:11: disconnected by user
Jan 26 15:09:14 compute-1 sshd-session[205535]: Disconnected from user nova 192.168.122.100 port 41818
Jan 26 15:09:14 compute-1 sshd-session[205532]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:09:14 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Jan 26 15:09:14 compute-1 systemd-logind[795]: Session 30 logged out. Waiting for processes to exit.
Jan 26 15:09:14 compute-1 systemd-logind[795]: Removed session 30.
Jan 26 15:09:15 compute-1 podman[205537]: 2026-01-26 15:09:15.907551047 +0000 UTC m=+0.069540784 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:09:15 compute-1 podman[205538]: 2026-01-26 15:09:15.908608143 +0000 UTC m=+0.076539430 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, architecture=x86_64)
Jan 26 15:09:17 compute-1 nova_compute[183403]: 2026-01-26 15:09:17.338 183407 DEBUG nova.compute.manager [req-66123e55-ce81-407d-9900-64568d3c6c46 req-e5a70a1f-8e37-4a33-9b41-6896f379f262 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:09:17 compute-1 nova_compute[183403]: 2026-01-26 15:09:17.338 183407 DEBUG oslo_concurrency.lockutils [req-66123e55-ce81-407d-9900-64568d3c6c46 req-e5a70a1f-8e37-4a33-9b41-6896f379f262 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:17 compute-1 nova_compute[183403]: 2026-01-26 15:09:17.339 183407 DEBUG oslo_concurrency.lockutils [req-66123e55-ce81-407d-9900-64568d3c6c46 req-e5a70a1f-8e37-4a33-9b41-6896f379f262 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:17 compute-1 nova_compute[183403]: 2026-01-26 15:09:17.339 183407 DEBUG oslo_concurrency.lockutils [req-66123e55-ce81-407d-9900-64568d3c6c46 req-e5a70a1f-8e37-4a33-9b41-6896f379f262 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:09:17 compute-1 nova_compute[183403]: 2026-01-26 15:09:17.339 183407 DEBUG nova.compute.manager [req-66123e55-ce81-407d-9900-64568d3c6c46 req-e5a70a1f-8e37-4a33-9b41-6896f379f262 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] No waiting events found dispatching network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:09:17 compute-1 nova_compute[183403]: 2026-01-26 15:09:17.339 183407 WARNING nova.compute.manager [req-66123e55-ce81-407d-9900-64568d3c6c46 req-e5a70a1f-8e37-4a33-9b41-6896f379f262 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received unexpected event network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 for instance with vm_state active and task_state resize_migrating.
Jan 26 15:09:17 compute-1 sshd-session[205509]: Invalid user teste from 185.246.128.170 port 37200
Jan 26 15:09:17 compute-1 nova_compute[183403]: 2026-01-26 15:09:17.749 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:17 compute-1 sshd-session[205581]: Accepted publickey for nova from 192.168.122.100 port 43088 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:09:17 compute-1 systemd-logind[795]: New session 31 of user nova.
Jan 26 15:09:17 compute-1 systemd[1]: Started Session 31 of User nova.
Jan 26 15:09:17 compute-1 sshd-session[205581]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:09:18 compute-1 sshd-session[205584]: Received disconnect from 192.168.122.100 port 43088:11: disconnected by user
Jan 26 15:09:18 compute-1 sshd-session[205584]: Disconnected from user nova 192.168.122.100 port 43088
Jan 26 15:09:18 compute-1 nova_compute[183403]: 2026-01-26 15:09:18.383 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:18 compute-1 sshd-session[205581]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:09:18 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Jan 26 15:09:18 compute-1 systemd-logind[795]: Session 31 logged out. Waiting for processes to exit.
Jan 26 15:09:18 compute-1 systemd-logind[795]: Removed session 31.
Jan 26 15:09:18 compute-1 sshd-session[205586]: Accepted publickey for nova from 192.168.122.100 port 43092 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:09:18 compute-1 systemd-logind[795]: New session 32 of user nova.
Jan 26 15:09:18 compute-1 systemd[1]: Started Session 32 of User nova.
Jan 26 15:09:18 compute-1 sshd-session[205586]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:09:18 compute-1 sshd-session[205589]: Received disconnect from 192.168.122.100 port 43092:11: disconnected by user
Jan 26 15:09:18 compute-1 sshd-session[205589]: Disconnected from user nova 192.168.122.100 port 43092
Jan 26 15:09:18 compute-1 sshd-session[205586]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:09:18 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Jan 26 15:09:18 compute-1 systemd-logind[795]: Session 32 logged out. Waiting for processes to exit.
Jan 26 15:09:18 compute-1 systemd-logind[795]: Removed session 32.
Jan 26 15:09:18 compute-1 sshd-session[205591]: Accepted publickey for nova from 192.168.122.100 port 43094 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:09:18 compute-1 systemd-logind[795]: New session 33 of user nova.
Jan 26 15:09:18 compute-1 systemd[1]: Started Session 33 of User nova.
Jan 26 15:09:18 compute-1 sshd-session[205591]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:09:18 compute-1 sshd-session[205594]: Received disconnect from 192.168.122.100 port 43094:11: disconnected by user
Jan 26 15:09:18 compute-1 sshd-session[205594]: Disconnected from user nova 192.168.122.100 port 43094
Jan 26 15:09:18 compute-1 sshd-session[205591]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:09:18 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Jan 26 15:09:18 compute-1 systemd-logind[795]: Session 33 logged out. Waiting for processes to exit.
Jan 26 15:09:18 compute-1 systemd-logind[795]: Removed session 33.
Jan 26 15:09:19 compute-1 nova_compute[183403]: 2026-01-26 15:09:19.412 183407 DEBUG nova.compute.manager [req-aeb8b368-2ff9-45ea-91da-3df384237531 req-6baeaacc-9327-4b49-b3cd-09dfdaa76590 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:09:19 compute-1 nova_compute[183403]: 2026-01-26 15:09:19.412 183407 DEBUG oslo_concurrency.lockutils [req-aeb8b368-2ff9-45ea-91da-3df384237531 req-6baeaacc-9327-4b49-b3cd-09dfdaa76590 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:19 compute-1 nova_compute[183403]: 2026-01-26 15:09:19.413 183407 DEBUG oslo_concurrency.lockutils [req-aeb8b368-2ff9-45ea-91da-3df384237531 req-6baeaacc-9327-4b49-b3cd-09dfdaa76590 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:19 compute-1 nova_compute[183403]: 2026-01-26 15:09:19.413 183407 DEBUG oslo_concurrency.lockutils [req-aeb8b368-2ff9-45ea-91da-3df384237531 req-6baeaacc-9327-4b49-b3cd-09dfdaa76590 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:09:19 compute-1 nova_compute[183403]: 2026-01-26 15:09:19.413 183407 DEBUG nova.compute.manager [req-aeb8b368-2ff9-45ea-91da-3df384237531 req-6baeaacc-9327-4b49-b3cd-09dfdaa76590 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] No waiting events found dispatching network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:09:19 compute-1 nova_compute[183403]: 2026-01-26 15:09:19.413 183407 WARNING nova.compute.manager [req-aeb8b368-2ff9-45ea-91da-3df384237531 req-6baeaacc-9327-4b49-b3cd-09dfdaa76590 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received unexpected event network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 for instance with vm_state active and task_state resize_migrating.
Jan 26 15:09:19 compute-1 openstack_network_exporter[195610]: ERROR   15:09:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:09:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:09:19 compute-1 openstack_network_exporter[195610]: ERROR   15:09:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:09:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:09:21 compute-1 sshd-session[205509]: Disconnecting invalid user teste 185.246.128.170 port 37200: Change of username or service not allowed: (teste,ssh-connection) -> (backup,ssh-connection) [preauth]
Jan 26 15:09:21 compute-1 nova_compute[183403]: 2026-01-26 15:09:21.652 183407 WARNING neutronclient.v2_0.client [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:09:21 compute-1 nova_compute[183403]: 2026-01-26 15:09:21.961 183407 INFO nova.network.neutron [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Updating port 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 26 15:09:22 compute-1 nova_compute[183403]: 2026-01-26 15:09:22.502 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-8c64a2e0-f723-4adb-84fc-867073a92349" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:09:22 compute-1 nova_compute[183403]: 2026-01-26 15:09:22.502 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-8c64a2e0-f723-4adb-84fc-867073a92349" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:09:22 compute-1 nova_compute[183403]: 2026-01-26 15:09:22.502 183407 DEBUG nova.network.neutron [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:09:22 compute-1 nova_compute[183403]: 2026-01-26 15:09:22.539 183407 DEBUG nova.compute.manager [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-changed-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:09:22 compute-1 nova_compute[183403]: 2026-01-26 15:09:22.540 183407 DEBUG nova.compute.manager [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Refreshing instance network info cache due to event network-changed-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:09:22 compute-1 nova_compute[183403]: 2026-01-26 15:09:22.540 183407 DEBUG oslo_concurrency.lockutils [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-8c64a2e0-f723-4adb-84fc-867073a92349" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:09:22 compute-1 nova_compute[183403]: 2026-01-26 15:09:22.753 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:23 compute-1 nova_compute[183403]: 2026-01-26 15:09:23.009 183407 WARNING neutronclient.v2_0.client [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:09:23 compute-1 nova_compute[183403]: 2026-01-26 15:09:23.384 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:26 compute-1 nova_compute[183403]: 2026-01-26 15:09:26.631 183407 WARNING neutronclient.v2_0.client [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:09:26 compute-1 nova_compute[183403]: 2026-01-26 15:09:26.836 183407 DEBUG nova.network.neutron [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Updating instance_info_cache with network_info: [{"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.342 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-8c64a2e0-f723-4adb-84fc-867073a92349" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.346 183407 DEBUG oslo_concurrency.lockutils [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-8c64a2e0-f723-4adb-84fc-867073a92349" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.347 183407 DEBUG nova.network.neutron [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Refreshing network info cache for port 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.757 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.857 183407 WARNING neutronclient.v2_0.client [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:09:27 compute-1 podman[205599]: 2026-01-26 15:09:27.883252414 +0000 UTC m=+0.051873560 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.895 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.897 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.898 183407 INFO nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Creating image(s)
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.899 183407 DEBUG oslo_concurrency.processutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.978 183407 DEBUG oslo_concurrency.processutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.979 183407 DEBUG nova.virt.disk.api [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:09:27 compute-1 nova_compute[183403]: 2026-01-26 15:09:27.979 183407 DEBUG oslo_concurrency.processutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:09:28 compute-1 podman[205598]: 2026-01-26 15:09:28.0226357 +0000 UTC m=+0.196477621 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260120)
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.051 183407 DEBUG oslo_concurrency.processutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.052 183407 DEBUG nova.virt.disk.api [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.378 183407 WARNING neutronclient.v2_0.client [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.385 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.548 183407 DEBUG nova.network.neutron [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Updated VIF entry in instance network info cache for port 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.549 183407 DEBUG nova.network.neutron [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Updating instance_info_cache with network_info: [{"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.562 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.562 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Ensure instance console log exists: /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.563 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.563 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.563 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.566 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Start _get_guest_xml network_info=[{"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:91:f3:28"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.571 183407 WARNING nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.572 183407 DEBUG nova.virt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-737487078', uuid='8c64a2e0-f723-4adb-84fc-867073a92349'), owner=OwnerMeta(userid='afb4f4811cb043dca89a8413c390ba3d', username='tempest-TestExecuteActionsViaActuator-280856547-project-admin', projectid='6377892a338d4a7cbe63cf30bd2c63ea', projectname='tempest-TestExecuteActionsViaActuator-280856547'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.micro', flavorid='b6884b57-cf1a-443f-a7b9-2aea263b07fa', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:91:f3:28"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769440168.5728986) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.579 183407 DEBUG nova.virt.libvirt.host [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.580 183407 DEBUG nova.virt.libvirt.host [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.583 183407 DEBUG nova.virt.libvirt.host [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.584 183407 DEBUG nova.virt.libvirt.host [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.585 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.585 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b6884b57-cf1a-443f-a7b9-2aea263b07fa',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.585 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.586 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.586 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.586 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.586 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.586 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.586 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.586 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.587 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.587 183407 DEBUG nova.virt.hardware [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.590 183407 DEBUG oslo_concurrency.processutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.645 183407 DEBUG oslo_concurrency.processutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk.config --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.646 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "/var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.646 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "/var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.647 183407 DEBUG oslo_concurrency.lockutils [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "/var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.649 183407 DEBUG nova.virt.libvirt.vif [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:08:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-737487078',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-737487078',id=4,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:08:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-hm0fd26d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:09:19Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=8c64a2e0-f723-4adb-84fc-867073a92349,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:91:f3:28"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.649 183407 DEBUG nova.network.os_vif_util [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:91:f3:28"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.650 183407 DEBUG nova.network.os_vif_util [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:f3:28,bridge_name='br-int',has_traffic_filtering=True,id=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6dd62b2f-19') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.652 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <uuid>8c64a2e0-f723-4adb-84fc-867073a92349</uuid>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <name>instance-00000004</name>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <memory>196608</memory>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-737487078</nova:name>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:09:28</nova:creationTime>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <nova:flavor name="m1.micro" id="b6884b57-cf1a-443f-a7b9-2aea263b07fa">
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:memory>192</nova:memory>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:property name="hw_input_bus">usb</nova:property>
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:property name="hw_machine_type">q35</nova:property>
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:property name="hw_video_model">virtio</nova:property>
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:property name="hw_vif_model">virtio</nova:property>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:user uuid="afb4f4811cb043dca89a8413c390ba3d">tempest-TestExecuteActionsViaActuator-280856547-project-admin</nova:user>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:project uuid="6377892a338d4a7cbe63cf30bd2c63ea">tempest-TestExecuteActionsViaActuator-280856547</nova:project>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         <nova:port uuid="6dd62b2f-1957-4fa5-92d8-6a7d131f0d09">
Jan 26 15:09:28 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <system>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <entry name="serial">8c64a2e0-f723-4adb-84fc-867073a92349</entry>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <entry name="uuid">8c64a2e0-f723-4adb-84fc-867073a92349</entry>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     </system>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <os>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   </os>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <features>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   </features>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk.config"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:91:f3:28"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <target dev="tap6dd62b2f-19"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/console.log" append="off"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <video>
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     </video>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:09:28 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:09:28 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:09:28 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:09:28 compute-1 nova_compute[183403]: </domain>
Jan 26 15:09:28 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.653 183407 DEBUG nova.virt.libvirt.vif [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:08:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-737487078',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-737487078',id=4,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:08:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-hm0fd26d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_
model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:09:19Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=8c64a2e0-f723-4adb-84fc-867073a92349,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:91:f3:28"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.653 183407 DEBUG nova.network.os_vif_util [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:91:f3:28"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.653 183407 DEBUG nova.network.os_vif_util [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:f3:28,bridge_name='br-int',has_traffic_filtering=True,id=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6dd62b2f-19') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.654 183407 DEBUG os_vif [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:f3:28,bridge_name='br-int',has_traffic_filtering=True,id=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6dd62b2f-19') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.654 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.655 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.655 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.656 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.656 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c49cf571-c29f-5d30-a26b-19510f738607', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.657 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.658 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.663 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.663 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6dd62b2f-19, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.664 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6dd62b2f-19, col_values=(('qos', UUID('c1eb6b98-b987-44d8-ad5e-8de75dbf5921')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.664 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6dd62b2f-19, col_values=(('external_ids', {'iface-id': '6dd62b2f-1957-4fa5-92d8-6a7d131f0d09', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:f3:28', 'vm-uuid': '8c64a2e0-f723-4adb-84fc-867073a92349'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.665 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:28 compute-1 NetworkManager[55716]: <info>  [1769440168.6664] manager: (tap6dd62b2f-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.667 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.673 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:28 compute-1 nova_compute[183403]: 2026-01-26 15:09:28.674 183407 INFO os_vif [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:f3:28,bridge_name='br-int',has_traffic_filtering=True,id=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6dd62b2f-19')
Jan 26 15:09:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:29.029 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:29.030 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:29.031 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:09:29 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 26 15:09:29 compute-1 systemd[205515]: Activating special unit Exit the Session...
Jan 26 15:09:29 compute-1 systemd[205515]: Stopped target Main User Target.
Jan 26 15:09:29 compute-1 systemd[205515]: Stopped target Basic System.
Jan 26 15:09:29 compute-1 systemd[205515]: Stopped target Paths.
Jan 26 15:09:29 compute-1 systemd[205515]: Stopped target Sockets.
Jan 26 15:09:29 compute-1 systemd[205515]: Stopped target Timers.
Jan 26 15:09:29 compute-1 systemd[205515]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 26 15:09:29 compute-1 systemd[205515]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 15:09:29 compute-1 systemd[205515]: Closed D-Bus User Message Bus Socket.
Jan 26 15:09:29 compute-1 systemd[205515]: Stopped Create User's Volatile Files and Directories.
Jan 26 15:09:29 compute-1 systemd[205515]: Removed slice User Application Slice.
Jan 26 15:09:29 compute-1 systemd[205515]: Reached target Shutdown.
Jan 26 15:09:29 compute-1 systemd[205515]: Finished Exit the Session.
Jan 26 15:09:29 compute-1 systemd[205515]: Reached target Exit the Session.
Jan 26 15:09:29 compute-1 nova_compute[183403]: 2026-01-26 15:09:29.056 183407 DEBUG oslo_concurrency.lockutils [req-433e2b65-8c0a-4eb0-a843-b3eac53c2db8 req-d9ba49c6-d641-4dd9-a4c2-d8a6c3355704 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-8c64a2e0-f723-4adb-84fc-867073a92349" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:09:29 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 26 15:09:29 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 26 15:09:29 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 26 15:09:29 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 26 15:09:29 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 26 15:09:29 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 26 15:09:29 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.221 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.222 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.222 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No VIF found with MAC fa:16:3e:91:f3:28, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.224 183407 INFO nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Using config drive
Jan 26 15:09:30 compute-1 kernel: tap6dd62b2f-19: entered promiscuous mode
Jan 26 15:09:30 compute-1 NetworkManager[55716]: <info>  [1769440170.2904] manager: (tap6dd62b2f-19): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.294 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:30 compute-1 ovn_controller[95641]: 2026-01-26T15:09:30Z|00054|binding|INFO|Claiming lport 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 for this chassis.
Jan 26 15:09:30 compute-1 ovn_controller[95641]: 2026-01-26T15:09:30Z|00055|binding|INFO|6dd62b2f-1957-4fa5-92d8-6a7d131f0d09: Claiming fa:16:3e:91:f3:28 10.100.0.8
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.321 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:f3:28 10.100.0.8'], port_security=['fa:16:3e:91:f3:28 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8c64a2e0-f723-4adb-84fc-867073a92349', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '9', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.322 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf bound to our chassis
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.323 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.325 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:09:30 compute-1 ovn_controller[95641]: 2026-01-26T15:09:30Z|00056|binding|INFO|Setting lport 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 ovn-installed in OVS
Jan 26 15:09:30 compute-1 ovn_controller[95641]: 2026-01-26T15:09:30Z|00057|binding|INFO|Setting lport 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 up in Southbound
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.328 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:30 compute-1 systemd-udevd[205670]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.344 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[64ba9372-81c4-4241-b7fb-cf6297ac6eac]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:09:30 compute-1 NetworkManager[55716]: <info>  [1769440170.3567] device (tap6dd62b2f-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:09:30 compute-1 NetworkManager[55716]: <info>  [1769440170.3578] device (tap6dd62b2f-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:09:30 compute-1 systemd-machined[154697]: New machine qemu-3-instance-00000004.
Jan 26 15:09:30 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.389 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c15169-e5a7-4f7a-a996-e1612b97f2a6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.393 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[3be376de-b3f8-47b1-9930-d4e842d52727]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.429 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d8ba8f-2d23-48fc-b083-645c03d947da]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.448 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6afe29cb-8406-47fd-a383-d19bfb737c09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205685, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.469 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[aff1dcac-c666-481c-bdbd-043350fdbe7a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 205687, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 205687, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.471 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.473 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.474 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.474 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.474 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.475 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.475 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:09:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:30.476 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ac2ae501-bf59-4ced-bc28-919aadcb954a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:09:30 compute-1 sshd-session[205596]: Invalid user backup from 185.246.128.170 port 42412
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.893 183407 DEBUG nova.compute.manager [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.897 183407 INFO nova.virt.libvirt.driver [-] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Instance running successfully.
Jan 26 15:09:30 compute-1 virtqemud[183290]: argument unsupported: QEMU guest agent is not configured
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.900 183407 DEBUG nova.virt.libvirt.guest [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Jan 26 15:09:30 compute-1 nova_compute[183403]: 2026-01-26 15:09:30.900 183407 DEBUG nova.virt.libvirt.driver [None req-897af98e-24a1-417a-8660-49a9efef445e a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Jan 26 15:09:31 compute-1 nova_compute[183403]: 2026-01-26 15:09:31.175 183407 DEBUG nova.compute.manager [req-3b821709-5ca4-43c4-ba19-ede4ac9a853e req-3d90a2b3-d21f-4753-80d9-b25d343c1a11 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-vif-plugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:09:31 compute-1 nova_compute[183403]: 2026-01-26 15:09:31.176 183407 DEBUG oslo_concurrency.lockutils [req-3b821709-5ca4-43c4-ba19-ede4ac9a853e req-3d90a2b3-d21f-4753-80d9-b25d343c1a11 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:31 compute-1 nova_compute[183403]: 2026-01-26 15:09:31.176 183407 DEBUG oslo_concurrency.lockutils [req-3b821709-5ca4-43c4-ba19-ede4ac9a853e req-3d90a2b3-d21f-4753-80d9-b25d343c1a11 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:31 compute-1 nova_compute[183403]: 2026-01-26 15:09:31.177 183407 DEBUG oslo_concurrency.lockutils [req-3b821709-5ca4-43c4-ba19-ede4ac9a853e req-3d90a2b3-d21f-4753-80d9-b25d343c1a11 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:09:31 compute-1 nova_compute[183403]: 2026-01-26 15:09:31.177 183407 DEBUG nova.compute.manager [req-3b821709-5ca4-43c4-ba19-ede4ac9a853e req-3d90a2b3-d21f-4753-80d9-b25d343c1a11 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] No waiting events found dispatching network-vif-plugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:09:31 compute-1 nova_compute[183403]: 2026-01-26 15:09:31.179 183407 WARNING nova.compute.manager [req-3b821709-5ca4-43c4-ba19-ede4ac9a853e req-3d90a2b3-d21f-4753-80d9-b25d343c1a11 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received unexpected event network-vif-plugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 for instance with vm_state active and task_state resize_finish.
Jan 26 15:09:31 compute-1 sshd-session[205596]: Disconnecting invalid user backup 185.246.128.170 port 42412: Change of username or service not allowed: (backup,ssh-connection) -> (cirros,ssh-connection) [preauth]
Jan 26 15:09:33 compute-1 nova_compute[183403]: 2026-01-26 15:09:33.252 183407 DEBUG nova.compute.manager [req-445a6be9-334c-4e36-9c0e-a436706a7946 req-f7f0a706-8921-4b4c-a887-3b1bb1c19a46 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-vif-plugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:09:33 compute-1 nova_compute[183403]: 2026-01-26 15:09:33.252 183407 DEBUG oslo_concurrency.lockutils [req-445a6be9-334c-4e36-9c0e-a436706a7946 req-f7f0a706-8921-4b4c-a887-3b1bb1c19a46 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:33 compute-1 nova_compute[183403]: 2026-01-26 15:09:33.252 183407 DEBUG oslo_concurrency.lockutils [req-445a6be9-334c-4e36-9c0e-a436706a7946 req-f7f0a706-8921-4b4c-a887-3b1bb1c19a46 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:33 compute-1 nova_compute[183403]: 2026-01-26 15:09:33.252 183407 DEBUG oslo_concurrency.lockutils [req-445a6be9-334c-4e36-9c0e-a436706a7946 req-f7f0a706-8921-4b4c-a887-3b1bb1c19a46 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:09:33 compute-1 nova_compute[183403]: 2026-01-26 15:09:33.253 183407 DEBUG nova.compute.manager [req-445a6be9-334c-4e36-9c0e-a436706a7946 req-f7f0a706-8921-4b4c-a887-3b1bb1c19a46 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] No waiting events found dispatching network-vif-plugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:09:33 compute-1 nova_compute[183403]: 2026-01-26 15:09:33.253 183407 WARNING nova.compute.manager [req-445a6be9-334c-4e36-9c0e-a436706a7946 req-f7f0a706-8921-4b4c-a887-3b1bb1c19a46 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received unexpected event network-vif-plugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 for instance with vm_state resized and task_state None.
Jan 26 15:09:33 compute-1 nova_compute[183403]: 2026-01-26 15:09:33.387 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:33 compute-1 nova_compute[183403]: 2026-01-26 15:09:33.666 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:34 compute-1 sshd-session[205696]: Invalid user cirros from 185.246.128.170 port 19419
Jan 26 15:09:35 compute-1 sshd-session[205696]: Disconnecting invalid user cirros 185.246.128.170 port 19419: Change of username or service not allowed: (cirros,ssh-connection) -> (secret,ssh-connection) [preauth]
Jan 26 15:09:35 compute-1 podman[192725]: time="2026-01-26T15:09:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:09:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:09:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:09:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:09:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2626 "" "Go-http-client/1.1"
Jan 26 15:09:38 compute-1 nova_compute[183403]: 2026-01-26 15:09:38.389 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:38 compute-1 nova_compute[183403]: 2026-01-26 15:09:38.668 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:43 compute-1 nova_compute[183403]: 2026-01-26 15:09:43.392 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:43 compute-1 nova_compute[183403]: 2026-01-26 15:09:43.670 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:44.080 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:09:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:44.081 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:09:44 compute-1 nova_compute[183403]: 2026-01-26 15:09:44.081 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:45 compute-1 ovn_controller[95641]: 2026-01-26T15:09:45Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:91:f3:28 10.100.0.8
Jan 26 15:09:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:09:46.082 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:09:46 compute-1 podman[205721]: 2026-01-26 15:09:46.889236221 +0000 UTC m=+0.065226722 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, release=1755695350, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 15:09:46 compute-1 podman[205720]: 2026-01-26 15:09:46.903248876 +0000 UTC m=+0.079204265 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:09:48 compute-1 nova_compute[183403]: 2026-01-26 15:09:48.394 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:48 compute-1 nova_compute[183403]: 2026-01-26 15:09:48.671 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:49 compute-1 openstack_network_exporter[195610]: ERROR   15:09:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:09:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:09:49 compute-1 openstack_network_exporter[195610]: ERROR   15:09:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:09:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:09:52 compute-1 sshd-session[205718]: Invalid user secret from 185.246.128.170 port 12609
Jan 26 15:09:53 compute-1 sshd-session[205718]: Disconnecting invalid user secret 185.246.128.170 port 12609: Change of username or service not allowed: (secret,ssh-connection) -> (amir,ssh-connection) [preauth]
Jan 26 15:09:53 compute-1 nova_compute[183403]: 2026-01-26 15:09:53.396 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:53 compute-1 nova_compute[183403]: 2026-01-26 15:09:53.673 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:56 compute-1 sshd-session[205767]: Invalid user sol from 80.94.92.171 port 60766
Jan 26 15:09:56 compute-1 sshd-session[205767]: Connection closed by invalid user sol 80.94.92.171 port 60766 [preauth]
Jan 26 15:09:58 compute-1 nova_compute[183403]: 2026-01-26 15:09:58.413 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:58 compute-1 nova_compute[183403]: 2026-01-26 15:09:58.463 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "aab8c28e-0489-40bd-88cf-5eb7c419933a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:58 compute-1 nova_compute[183403]: 2026-01-26 15:09:58.463 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:58 compute-1 nova_compute[183403]: 2026-01-26 15:09:58.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:09:58 compute-1 nova_compute[183403]: 2026-01-26 15:09:58.707 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:09:58 compute-1 podman[205772]: 2026-01-26 15:09:58.889130608 +0000 UTC m=+0.065781044 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 15:09:58 compute-1 podman[205771]: 2026-01-26 15:09:58.918074435 +0000 UTC m=+0.100613064 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, 
managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 15:09:58 compute-1 nova_compute[183403]: 2026-01-26 15:09:58.970 183407 DEBUG nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:09:59 compute-1 nova_compute[183403]: 2026-01-26 15:09:59.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:09:59 compute-1 nova_compute[183403]: 2026-01-26 15:09:59.729 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:09:59 compute-1 nova_compute[183403]: 2026-01-26 15:09:59.730 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:09:59 compute-1 nova_compute[183403]: 2026-01-26 15:09:59.735 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:09:59 compute-1 nova_compute[183403]: 2026-01-26 15:09:59.736 183407 INFO nova.compute.claims [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:10:00 compute-1 nova_compute[183403]: 2026-01-26 15:10:00.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:00 compute-1 nova_compute[183403]: 2026-01-26 15:10:00.828 183407 DEBUG nova.compute.provider_tree [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:10:01 compute-1 nova_compute[183403]: 2026-01-26 15:10:01.337 183407 DEBUG nova.scheduler.client.report [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:10:01 compute-1 nova_compute[183403]: 2026-01-26 15:10:01.848 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:01 compute-1 nova_compute[183403]: 2026-01-26 15:10:01.849 183407 DEBUG nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:10:01 compute-1 nova_compute[183403]: 2026-01-26 15:10:01.851 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.760s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:01 compute-1 nova_compute[183403]: 2026-01-26 15:10:01.851 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:01 compute-1 nova_compute[183403]: 2026-01-26 15:10:01.851 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:10:02 compute-1 nova_compute[183403]: 2026-01-26 15:10:02.359 183407 DEBUG nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:10:02 compute-1 nova_compute[183403]: 2026-01-26 15:10:02.359 183407 DEBUG nova.network.neutron [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:10:02 compute-1 nova_compute[183403]: 2026-01-26 15:10:02.360 183407 WARNING neutronclient.v2_0.client [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:02 compute-1 nova_compute[183403]: 2026-01-26 15:10:02.360 183407 WARNING neutronclient.v2_0.client [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:02 compute-1 nova_compute[183403]: 2026-01-26 15:10:02.869 183407 INFO nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:10:02 compute-1 nova_compute[183403]: 2026-01-26 15:10:02.892 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:02 compute-1 nova_compute[183403]: 2026-01-26 15:10:02.957 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:02 compute-1 nova_compute[183403]: 2026-01-26 15:10:02.958 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.013 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.020 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.074 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.075 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.124 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.271 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.272 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.293 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.294 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5488MB free_disk=73.09177017211914GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.294 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.294 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.376 183407 DEBUG nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.415 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:03 compute-1 nova_compute[183403]: 2026-01-26 15:10:03.710 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.065 183407 DEBUG nova.network.neutron [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Successfully created port: beb27c83-cf29-484e-a6e9-e9c6a978afd5 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.339 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 66a7af21-1abe-467f-b739-441e05a4b09a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.340 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 8c64a2e0-f723-4adb-84fc-867073a92349 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.340 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance aab8c28e-0489-40bd-88cf-5eb7c419933a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.340 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.341 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:10:03 up  1:05,  0 user,  load average: 0.33, 0.21, 0.37\n', 'num_instances': '3', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '3', 'num_proj_6377892a338d4a7cbe63cf30bd2c63ea': '3', 'io_workload': '1', 'num_vm_building': '1', 'num_task_networking': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.397 183407 DEBUG nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.399 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.400 183407 INFO nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Creating image(s)
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.400 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "/var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.401 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "/var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.401 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "/var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.402 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.405 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.407 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.426 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.487 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.487 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.488 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.489 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.491 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.492 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.556 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.557 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.601 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.602 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.602 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.674 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.675 183407 DEBUG nova.virt.disk.api [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Checking if we can resize image /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.676 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.731 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.732 183407 DEBUG nova.virt.disk.api [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Cannot resize image /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.733 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.733 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Ensure instance console log exists: /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.733 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.734 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:04 compute-1 nova_compute[183403]: 2026-01-26 15:10:04.734 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:05 compute-1 nova_compute[183403]: 2026-01-26 15:10:05.014 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:10:05 compute-1 nova_compute[183403]: 2026-01-26 15:10:05.607 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:10:05 compute-1 nova_compute[183403]: 2026-01-26 15:10:05.608 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.314s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:05 compute-1 podman[192725]: time="2026-01-26T15:10:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:10:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:10:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:10:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:10:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2631 "" "Go-http-client/1.1"
Jan 26 15:10:05 compute-1 sshd-session[205769]: Invalid user amir from 185.246.128.170 port 4341
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.125 183407 DEBUG nova.network.neutron [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Successfully updated port: beb27c83-cf29-484e-a6e9-e9c6a978afd5 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.330 183407 DEBUG nova.compute.manager [req-358d53d2-59b7-4c2d-a87a-1be7b70b8e20 req-c6d7f686-8bd3-42e5-b62b-4698747a7d2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Received event network-changed-beb27c83-cf29-484e-a6e9-e9c6a978afd5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.330 183407 DEBUG nova.compute.manager [req-358d53d2-59b7-4c2d-a87a-1be7b70b8e20 req-c6d7f686-8bd3-42e5-b62b-4698747a7d2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Refreshing instance network info cache due to event network-changed-beb27c83-cf29-484e-a6e9-e9c6a978afd5. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.331 183407 DEBUG oslo_concurrency.lockutils [req-358d53d2-59b7-4c2d-a87a-1be7b70b8e20 req-c6d7f686-8bd3-42e5-b62b-4698747a7d2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-aab8c28e-0489-40bd-88cf-5eb7c419933a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.331 183407 DEBUG oslo_concurrency.lockutils [req-358d53d2-59b7-4c2d-a87a-1be7b70b8e20 req-c6d7f686-8bd3-42e5-b62b-4698747a7d2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-aab8c28e-0489-40bd-88cf-5eb7c419933a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.331 183407 DEBUG nova.network.neutron [req-358d53d2-59b7-4c2d-a87a-1be7b70b8e20 req-c6d7f686-8bd3-42e5-b62b-4698747a7d2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Refreshing network info cache for port beb27c83-cf29-484e-a6e9-e9c6a978afd5 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.608 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.633 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "refresh_cache-aab8c28e-0489-40bd-88cf-5eb7c419933a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.840 183407 WARNING neutronclient.v2_0.client [req-358d53d2-59b7-4c2d-a87a-1be7b70b8e20 req-c6d7f686-8bd3-42e5-b62b-4698747a7d2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:06 compute-1 sshd-session[205769]: Disconnecting invalid user amir 185.246.128.170 port 4341: Change of username or service not allowed: (amir,ssh-connection) -> (finance,ssh-connection) [preauth]
Jan 26 15:10:06 compute-1 nova_compute[183403]: 2026-01-26 15:10:06.971 183407 DEBUG nova.network.neutron [req-358d53d2-59b7-4c2d-a87a-1be7b70b8e20 req-c6d7f686-8bd3-42e5-b62b-4698747a7d2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.124 183407 DEBUG nova.network.neutron [req-358d53d2-59b7-4c2d-a87a-1be7b70b8e20 req-c6d7f686-8bd3-42e5-b62b-4698747a7d2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.440 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.441 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.442 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.442 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.442 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.442 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.953 183407 DEBUG oslo_concurrency.lockutils [req-358d53d2-59b7-4c2d-a87a-1be7b70b8e20 req-c6d7f686-8bd3-42e5-b62b-4698747a7d2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-aab8c28e-0489-40bd-88cf-5eb7c419933a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.955 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquired lock "refresh_cache-aab8c28e-0489-40bd-88cf-5eb7c419933a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:10:07 compute-1 nova_compute[183403]: 2026-01-26 15:10:07.955 183407 DEBUG nova.network.neutron [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:10:08 compute-1 nova_compute[183403]: 2026-01-26 15:10:08.417 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:08 compute-1 nova_compute[183403]: 2026-01-26 15:10:08.711 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:09 compute-1 nova_compute[183403]: 2026-01-26 15:10:09.083 183407 DEBUG nova.network.neutron [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:10:09 compute-1 nova_compute[183403]: 2026-01-26 15:10:09.295 183407 WARNING neutronclient.v2_0.client [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:09 compute-1 nova_compute[183403]: 2026-01-26 15:10:09.405 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:10:09 compute-1 nova_compute[183403]: 2026-01-26 15:10:09.494 183407 DEBUG nova.network.neutron [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Updating instance_info_cache with network_info: [{"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.460 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Releasing lock "refresh_cache-aab8c28e-0489-40bd-88cf-5eb7c419933a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.461 183407 DEBUG nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Instance network_info: |[{"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.463 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Start _get_guest_xml network_info=[{"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.466 183407 WARNING nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.467 183407 DEBUG nova.virt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-368148754', uuid='aab8c28e-0489-40bd-88cf-5eb7c419933a'), owner=OwnerMeta(userid='afb4f4811cb043dca89a8413c390ba3d', username='tempest-TestExecuteActionsViaActuator-280856547-project-admin', projectid='6377892a338d4a7cbe63cf30bd2c63ea', projectname='tempest-TestExecuteActionsViaActuator-280856547'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769440210.4672897) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.473 183407 DEBUG nova.virt.libvirt.host [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.473 183407 DEBUG nova.virt.libvirt.host [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.477 183407 DEBUG nova.virt.libvirt.host [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.477 183407 DEBUG nova.virt.libvirt.host [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.478 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.479 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.479 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.479 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.479 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.480 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.480 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.480 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.480 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.480 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.480 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.481 183407 DEBUG nova.virt.hardware [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.484 183407 DEBUG nova.virt.libvirt.vif [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-368148754',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-368148754',id=7,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-nrh6656l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:10:03Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=aab8c28e-0489-40bd-88cf-5eb7c419933a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.484 183407 DEBUG nova.network.os_vif_util [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.485 183407 DEBUG nova.network.os_vif_util [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:22:83,bridge_name='br-int',has_traffic_filtering=True,id=beb27c83-cf29-484e-a6e9-e9c6a978afd5,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeb27c83-cf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.486 183407 DEBUG nova.objects.instance [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lazy-loading 'pci_devices' on Instance uuid aab8c28e-0489-40bd-88cf-5eb7c419933a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:10:10 compute-1 nova_compute[183403]: 2026-01-26 15:10:10.999 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:10:10 compute-1 nova_compute[183403]:   <uuid>aab8c28e-0489-40bd-88cf-5eb7c419933a</uuid>
Jan 26 15:10:10 compute-1 nova_compute[183403]:   <name>instance-00000007</name>
Jan 26 15:10:10 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:10:10 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-368148754</nova:name>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:10:10</nova:creationTime>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:10:11 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:10:11 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:user uuid="afb4f4811cb043dca89a8413c390ba3d">tempest-TestExecuteActionsViaActuator-280856547-project-admin</nova:user>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:project uuid="6377892a338d4a7cbe63cf30bd2c63ea">tempest-TestExecuteActionsViaActuator-280856547</nova:project>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         <nova:port uuid="beb27c83-cf29-484e-a6e9-e9c6a978afd5">
Jan 26 15:10:11 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <system>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <entry name="serial">aab8c28e-0489-40bd-88cf-5eb7c419933a</entry>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <entry name="uuid">aab8c28e-0489-40bd-88cf-5eb7c419933a</entry>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     </system>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   <os>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   </os>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   <features>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   </features>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk.config"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:8b:22:83"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <target dev="tapbeb27c83-cf"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/console.log" append="off"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <video>
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     </video>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:10:11 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:10:11 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:10:11 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:10:11 compute-1 nova_compute[183403]: </domain>
Jan 26 15:10:11 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.001 183407 DEBUG nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Preparing to wait for external event network-vif-plugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.001 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.001 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.002 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.003 183407 DEBUG nova.virt.libvirt.vif [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-368148754',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-368148754',id=7,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-nrh6656l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:10:03Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=aab8c28e-0489-40bd-88cf-5eb7c419933a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.003 183407 DEBUG nova.network.os_vif_util [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.004 183407 DEBUG nova.network.os_vif_util [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:22:83,bridge_name='br-int',has_traffic_filtering=True,id=beb27c83-cf29-484e-a6e9-e9c6a978afd5,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeb27c83-cf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.004 183407 DEBUG os_vif [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:22:83,bridge_name='br-int',has_traffic_filtering=True,id=beb27c83-cf29-484e-a6e9-e9c6a978afd5,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeb27c83-cf') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.005 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.006 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.006 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.007 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.007 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c2341bcc-796b-5d5c-a07d-85affbeca0e1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.008 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.010 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.016 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.016 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbeb27c83-cf, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.016 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbeb27c83-cf, col_values=(('qos', UUID('f3ab9540-8705-49df-a846-bccf7b6ee2f1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.017 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbeb27c83-cf, col_values=(('external_ids', {'iface-id': 'beb27c83-cf29-484e-a6e9-e9c6a978afd5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:22:83', 'vm-uuid': 'aab8c28e-0489-40bd-88cf-5eb7c419933a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.018 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:11 compute-1 NetworkManager[55716]: <info>  [1769440211.0194] manager: (tapbeb27c83-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.020 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.024 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:11 compute-1 nova_compute[183403]: 2026-01-26 15:10:11.025 183407 INFO os_vif [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:22:83,bridge_name='br-int',has_traffic_filtering=True,id=beb27c83-cf29-484e-a6e9-e9c6a978afd5,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeb27c83-cf')
Jan 26 15:10:13 compute-1 nova_compute[183403]: 2026-01-26 15:10:13.419 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:13 compute-1 nova_compute[183403]: 2026-01-26 15:10:13.735 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:10:13 compute-1 nova_compute[183403]: 2026-01-26 15:10:13.736 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:10:13 compute-1 nova_compute[183403]: 2026-01-26 15:10:13.736 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] No VIF found with MAC fa:16:3e:8b:22:83, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:10:13 compute-1 nova_compute[183403]: 2026-01-26 15:10:13.736 183407 INFO nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Using config drive
Jan 26 15:10:13 compute-1 sshd-session[205849]: Invalid user ubuntu from 103.42.57.146 port 39520
Jan 26 15:10:14 compute-1 sshd-session[205849]: Received disconnect from 103.42.57.146 port 39520:11:  [preauth]
Jan 26 15:10:14 compute-1 sshd-session[205849]: Disconnected from invalid user ubuntu 103.42.57.146 port 39520 [preauth]
Jan 26 15:10:14 compute-1 nova_compute[183403]: 2026-01-26 15:10:14.249 183407 WARNING neutronclient.v2_0.client [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:14 compute-1 nova_compute[183403]: 2026-01-26 15:10:14.436 183407 INFO nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Creating config drive at /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk.config
Jan 26 15:10:14 compute-1 nova_compute[183403]: 2026-01-26 15:10:14.441 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpl2x4zkzi execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:14 compute-1 nova_compute[183403]: 2026-01-26 15:10:14.564 183407 DEBUG oslo_concurrency.processutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpl2x4zkzi" returned: 0 in 0.123s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:14 compute-1 kernel: tapbeb27c83-cf: entered promiscuous mode
Jan 26 15:10:14 compute-1 NetworkManager[55716]: <info>  [1769440214.6172] manager: (tapbeb27c83-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Jan 26 15:10:14 compute-1 ovn_controller[95641]: 2026-01-26T15:10:14Z|00058|binding|INFO|Claiming lport beb27c83-cf29-484e-a6e9-e9c6a978afd5 for this chassis.
Jan 26 15:10:14 compute-1 ovn_controller[95641]: 2026-01-26T15:10:14Z|00059|binding|INFO|beb27c83-cf29-484e-a6e9-e9c6a978afd5: Claiming fa:16:3e:8b:22:83 10.100.0.14
Jan 26 15:10:14 compute-1 nova_compute[183403]: 2026-01-26 15:10:14.618 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.626 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:22:83 10.100.0.14'], port_security=['fa:16:3e:8b:22:83 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'aab8c28e-0489-40bd-88cf-5eb7c419933a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=beb27c83-cf29-484e-a6e9-e9c6a978afd5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.627 104930 INFO neutron.agent.ovn.metadata.agent [-] Port beb27c83-cf29-484e-a6e9-e9c6a978afd5 in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf bound to our chassis
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.629 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:10:14 compute-1 nova_compute[183403]: 2026-01-26 15:10:14.631 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:14 compute-1 ovn_controller[95641]: 2026-01-26T15:10:14Z|00060|binding|INFO|Setting lport beb27c83-cf29-484e-a6e9-e9c6a978afd5 ovn-installed in OVS
Jan 26 15:10:14 compute-1 ovn_controller[95641]: 2026-01-26T15:10:14Z|00061|binding|INFO|Setting lport beb27c83-cf29-484e-a6e9-e9c6a978afd5 up in Southbound
Jan 26 15:10:14 compute-1 nova_compute[183403]: 2026-01-26 15:10:14.633 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.645 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9884042c-1d02-441e-ab51-ba6042ea51a0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:10:14 compute-1 systemd-udevd[205869]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:10:14 compute-1 systemd-machined[154697]: New machine qemu-4-instance-00000007.
Jan 26 15:10:14 compute-1 NetworkManager[55716]: <info>  [1769440214.6668] device (tapbeb27c83-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:10:14 compute-1 NetworkManager[55716]: <info>  [1769440214.6691] device (tapbeb27c83-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:10:14 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.683 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec79be2-c65d-49e0-91de-5006b4cd7f83]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.686 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[175519c8-0197-4d34-a6d7-f8bb67eb5465]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.717 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca41673-73ee-4fc1-8760-1888062ceceb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.733 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[233795ab-1a39-4d01-b36d-b12fdb736117]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205882, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.750 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[36871f84-16dc-4e8b-8495-b9a141542a72]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 205884, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 205884, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.752 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:14 compute-1 nova_compute[183403]: 2026-01-26 15:10:14.753 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:14 compute-1 nova_compute[183403]: 2026-01-26 15:10:14.754 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.755 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.755 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.755 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.755 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:10:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:14.757 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4e3db7c5-adfe-4412-aef0-67056c0d3d9f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.209 183407 DEBUG nova.compute.manager [req-30ebcb00-ee6e-4a38-afa7-2c91825ccdd7 req-b18ec200-37dd-4151-95d4-c9a78d558121 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Received event network-vif-plugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.210 183407 DEBUG oslo_concurrency.lockutils [req-30ebcb00-ee6e-4a38-afa7-2c91825ccdd7 req-b18ec200-37dd-4151-95d4-c9a78d558121 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.210 183407 DEBUG oslo_concurrency.lockutils [req-30ebcb00-ee6e-4a38-afa7-2c91825ccdd7 req-b18ec200-37dd-4151-95d4-c9a78d558121 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.210 183407 DEBUG oslo_concurrency.lockutils [req-30ebcb00-ee6e-4a38-afa7-2c91825ccdd7 req-b18ec200-37dd-4151-95d4-c9a78d558121 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.210 183407 DEBUG nova.compute.manager [req-30ebcb00-ee6e-4a38-afa7-2c91825ccdd7 req-b18ec200-37dd-4151-95d4-c9a78d558121 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Processing event network-vif-plugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.211 183407 DEBUG nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.216 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.220 183407 INFO nova.virt.libvirt.driver [-] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Instance spawned successfully.
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.220 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.732 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.733 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.733 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.733 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.734 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:10:15 compute-1 nova_compute[183403]: 2026-01-26 15:10:15.734 183407 DEBUG nova.virt.libvirt.driver [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:10:16 compute-1 nova_compute[183403]: 2026-01-26 15:10:16.018 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:16 compute-1 nova_compute[183403]: 2026-01-26 15:10:16.242 183407 INFO nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Took 11.84 seconds to spawn the instance on the hypervisor.
Jan 26 15:10:16 compute-1 nova_compute[183403]: 2026-01-26 15:10:16.242 183407 DEBUG nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:10:16 compute-1 nova_compute[183403]: 2026-01-26 15:10:16.781 183407 INFO nova.compute.manager [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Took 17.09 seconds to build instance.
Jan 26 15:10:17 compute-1 nova_compute[183403]: 2026-01-26 15:10:17.288 183407 DEBUG oslo_concurrency.lockutils [None req-54f1fee3-02a6-47db-a7c8-60f45ea4f2d3 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.825s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:17 compute-1 nova_compute[183403]: 2026-01-26 15:10:17.897 183407 DEBUG nova.compute.manager [req-1ed87564-3014-468d-aa00-9e25e36834c4 req-c52beeda-9613-4b1d-b4a4-2c92aa51c3e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Received event network-vif-plugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:10:17 compute-1 nova_compute[183403]: 2026-01-26 15:10:17.898 183407 DEBUG oslo_concurrency.lockutils [req-1ed87564-3014-468d-aa00-9e25e36834c4 req-c52beeda-9613-4b1d-b4a4-2c92aa51c3e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:17 compute-1 nova_compute[183403]: 2026-01-26 15:10:17.898 183407 DEBUG oslo_concurrency.lockutils [req-1ed87564-3014-468d-aa00-9e25e36834c4 req-c52beeda-9613-4b1d-b4a4-2c92aa51c3e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:17 compute-1 nova_compute[183403]: 2026-01-26 15:10:17.898 183407 DEBUG oslo_concurrency.lockutils [req-1ed87564-3014-468d-aa00-9e25e36834c4 req-c52beeda-9613-4b1d-b4a4-2c92aa51c3e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:17 compute-1 nova_compute[183403]: 2026-01-26 15:10:17.898 183407 DEBUG nova.compute.manager [req-1ed87564-3014-468d-aa00-9e25e36834c4 req-c52beeda-9613-4b1d-b4a4-2c92aa51c3e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] No waiting events found dispatching network-vif-plugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:10:17 compute-1 nova_compute[183403]: 2026-01-26 15:10:17.898 183407 WARNING nova.compute.manager [req-1ed87564-3014-468d-aa00-9e25e36834c4 req-c52beeda-9613-4b1d-b4a4-2c92aa51c3e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Received unexpected event network-vif-plugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 for instance with vm_state active and task_state None.
Jan 26 15:10:17 compute-1 podman[205893]: 2026-01-26 15:10:17.904060059 +0000 UTC m=+0.075389807 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:10:17 compute-1 podman[205894]: 2026-01-26 15:10:17.918853675 +0000 UTC m=+0.089987577 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 26 15:10:18 compute-1 nova_compute[183403]: 2026-01-26 15:10:18.421 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:19 compute-1 openstack_network_exporter[195610]: ERROR   15:10:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:10:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:10:19 compute-1 openstack_network_exporter[195610]: ERROR   15:10:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:10:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:10:20 compute-1 sshd-session[205846]: Invalid user finance from 185.246.128.170 port 6588
Jan 26 15:10:21 compute-1 nova_compute[183403]: 2026-01-26 15:10:21.019 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:21 compute-1 sshd-session[205846]: Disconnecting invalid user finance 185.246.128.170 port 6588: Change of username or service not allowed: (finance,ssh-connection) -> (client,ssh-connection) [preauth]
Jan 26 15:10:23 compute-1 nova_compute[183403]: 2026-01-26 15:10:23.422 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:25 compute-1 sshd-session[205938]: Invalid user client from 185.246.128.170 port 52912
Jan 26 15:10:26 compute-1 nova_compute[183403]: 2026-01-26 15:10:26.022 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:26 compute-1 sshd-session[205938]: Disconnecting invalid user client 185.246.128.170 port 52912: Change of username or service not allowed: (client,ssh-connection) -> (a,ssh-connection) [preauth]
Jan 26 15:10:27 compute-1 ovn_controller[95641]: 2026-01-26T15:10:27Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:22:83 10.100.0.14
Jan 26 15:10:27 compute-1 ovn_controller[95641]: 2026-01-26T15:10:27Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:22:83 10.100.0.14
Jan 26 15:10:28 compute-1 nova_compute[183403]: 2026-01-26 15:10:28.424 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:28 compute-1 sshd-session[205956]: Invalid user a from 185.246.128.170 port 28594
Jan 26 15:10:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:29.032 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:29.032 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:29.034 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:29 compute-1 podman[205960]: 2026-01-26 15:10:29.929615086 +0000 UTC m=+0.068854620 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 15:10:29 compute-1 podman[205959]: 2026-01-26 15:10:29.956254336 +0000 UTC m=+0.129449670 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 15:10:31 compute-1 nova_compute[183403]: 2026-01-26 15:10:31.022 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:31 compute-1 sshd-session[205956]: Disconnecting invalid user a 185.246.128.170 port 28594: Change of username or service not allowed: (a,ssh-connection) -> (vali,ssh-connection) [preauth]
Jan 26 15:10:34 compute-1 nova_compute[183403]: 2026-01-26 15:10:34.132 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:34 compute-1 sshd-session[206003]: Invalid user vali from 185.246.128.170 port 33788
Jan 26 15:10:35 compute-1 sshd-session[206003]: Disconnecting invalid user vali 185.246.128.170 port 33788: Change of username or service not allowed: (vali,ssh-connection) -> (oracle,ssh-connection) [preauth]
Jan 26 15:10:35 compute-1 podman[192725]: time="2026-01-26T15:10:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:10:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:10:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:10:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:10:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2631 "" "Go-http-client/1.1"
Jan 26 15:10:36 compute-1 nova_compute[183403]: 2026-01-26 15:10:36.025 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:39 compute-1 nova_compute[183403]: 2026-01-26 15:10:39.136 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:41 compute-1 nova_compute[183403]: 2026-01-26 15:10:41.026 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:41.266 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:10:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:41.266 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:10:41 compute-1 nova_compute[183403]: 2026-01-26 15:10:41.267 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:44 compute-1 nova_compute[183403]: 2026-01-26 15:10:44.170 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:46 compute-1 nova_compute[183403]: 2026-01-26 15:10:46.028 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:46 compute-1 sshd-session[206005]: Invalid user oracle from 185.246.128.170 port 5168
Jan 26 15:10:47 compute-1 nova_compute[183403]: 2026-01-26 15:10:47.600 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:47 compute-1 nova_compute[183403]: 2026-01-26 15:10:47.600 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:48 compute-1 nova_compute[183403]: 2026-01-26 15:10:48.105 183407 DEBUG nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:10:48 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:10:48.267 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:48 compute-1 nova_compute[183403]: 2026-01-26 15:10:48.660 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:48 compute-1 nova_compute[183403]: 2026-01-26 15:10:48.660 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:48 compute-1 nova_compute[183403]: 2026-01-26 15:10:48.666 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:10:48 compute-1 nova_compute[183403]: 2026-01-26 15:10:48.666 183407 INFO nova.compute.claims [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:10:48 compute-1 podman[206021]: 2026-01-26 15:10:48.919742386 +0000 UTC m=+0.076759589 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 26 15:10:48 compute-1 podman[206020]: 2026-01-26 15:10:48.93760961 +0000 UTC m=+0.066602660 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:10:49 compute-1 nova_compute[183403]: 2026-01-26 15:10:49.230 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:49 compute-1 openstack_network_exporter[195610]: ERROR   15:10:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:10:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:10:49 compute-1 openstack_network_exporter[195610]: ERROR   15:10:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:10:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:10:49 compute-1 nova_compute[183403]: 2026-01-26 15:10:49.797 183407 DEBUG nova.compute.provider_tree [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:10:50 compute-1 nova_compute[183403]: 2026-01-26 15:10:50.304 183407 DEBUG nova.scheduler.client.report [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:10:50 compute-1 nova_compute[183403]: 2026-01-26 15:10:50.812 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.152s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:50 compute-1 nova_compute[183403]: 2026-01-26 15:10:50.813 183407 DEBUG nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:10:51 compute-1 nova_compute[183403]: 2026-01-26 15:10:51.030 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:51 compute-1 sshd-session[206005]: error: maximum authentication attempts exceeded for invalid user oracle from 185.246.128.170 port 5168 ssh2 [preauth]
Jan 26 15:10:51 compute-1 sshd-session[206005]: Disconnecting invalid user oracle 185.246.128.170 port 5168: Too many authentication failures [preauth]
Jan 26 15:10:51 compute-1 nova_compute[183403]: 2026-01-26 15:10:51.324 183407 DEBUG nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:10:51 compute-1 nova_compute[183403]: 2026-01-26 15:10:51.325 183407 DEBUG nova.network.neutron [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:10:51 compute-1 nova_compute[183403]: 2026-01-26 15:10:51.325 183407 WARNING neutronclient.v2_0.client [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:51 compute-1 nova_compute[183403]: 2026-01-26 15:10:51.326 183407 WARNING neutronclient.v2_0.client [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:51 compute-1 nova_compute[183403]: 2026-01-26 15:10:51.832 183407 INFO nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:10:52 compute-1 nova_compute[183403]: 2026-01-26 15:10:52.340 183407 DEBUG nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.039 183407 DEBUG nova.network.neutron [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Successfully created port: b50fc69b-cfde-429d-908f-cde6f56037bf _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.856 183407 DEBUG nova.network.neutron [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Successfully updated port: b50fc69b-cfde-429d-908f-cde6f56037bf _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.861 183407 DEBUG nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.863 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.864 183407 INFO nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Creating image(s)
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.865 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "/var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.865 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "/var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.866 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "/var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.867 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.874 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.876 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "refresh_cache-981e6db3-c4e9-422b-91bb-a2c1c5869fc4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.876 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquired lock "refresh_cache-981e6db3-c4e9-422b-91bb-a2c1c5869fc4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.877 183407 DEBUG nova.network.neutron [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.879 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.904 183407 DEBUG nova.compute.manager [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Received event network-changed-b50fc69b-cfde-429d-908f-cde6f56037bf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.905 183407 DEBUG nova.compute.manager [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Refreshing instance network info cache due to event network-changed-b50fc69b-cfde-429d-908f-cde6f56037bf. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.905 183407 DEBUG oslo_concurrency.lockutils [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-981e6db3-c4e9-422b-91bb-a2c1c5869fc4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.974 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.974 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.975 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.975 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.978 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:10:53 compute-1 nova_compute[183403]: 2026-01-26 15:10:53.979 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.030 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.031 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.232 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.249 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk 1073741824" returned: 0 in 0.218s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.250 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.275s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.251 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.306 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.307 183407 DEBUG nova.virt.disk.api [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Checking if we can resize image /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.308 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.369 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.370 183407 DEBUG nova.virt.disk.api [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Cannot resize image /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.370 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.370 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Ensure instance console log exists: /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.371 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.371 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:54 compute-1 nova_compute[183403]: 2026-01-26 15:10:54.372 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:55 compute-1 nova_compute[183403]: 2026-01-26 15:10:55.086 183407 DEBUG nova.network.neutron [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:10:56 compute-1 nova_compute[183403]: 2026-01-26 15:10:56.031 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:56 compute-1 nova_compute[183403]: 2026-01-26 15:10:56.130 183407 WARNING neutronclient.v2_0.client [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:56 compute-1 nova_compute[183403]: 2026-01-26 15:10:56.628 183407 DEBUG nova.network.neutron [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Updating instance_info_cache with network_info: [{"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.137 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Releasing lock "refresh_cache-981e6db3-c4e9-422b-91bb-a2c1c5869fc4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.138 183407 DEBUG nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Instance network_info: |[{"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.138 183407 DEBUG oslo_concurrency.lockutils [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-981e6db3-c4e9-422b-91bb-a2c1c5869fc4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.138 183407 DEBUG nova.network.neutron [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Refreshing network info cache for port b50fc69b-cfde-429d-908f-cde6f56037bf _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.141 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Start _get_guest_xml network_info=[{"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.147 183407 WARNING nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.148 183407 DEBUG nova.virt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-855077368', uuid='981e6db3-c4e9-422b-91bb-a2c1c5869fc4'), owner=OwnerMeta(userid='afb4f4811cb043dca89a8413c390ba3d', username='tempest-TestExecuteActionsViaActuator-280856547-project-admin', projectid='6377892a338d4a7cbe63cf30bd2c63ea', projectname='tempest-TestExecuteActionsViaActuator-280856547'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": 
"b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769440257.1482909) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.157 183407 DEBUG nova.virt.libvirt.host [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.158 183407 DEBUG nova.virt.libvirt.host [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.162 183407 DEBUG nova.virt.libvirt.host [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.162 183407 DEBUG nova.virt.libvirt.host [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.164 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.164 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.164 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.165 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.165 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.165 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.165 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.165 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.166 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.166 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.166 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.166 183407 DEBUG nova.virt.hardware [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.170 183407 DEBUG nova.virt.libvirt.vif [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:10:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-855077368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-855077368',id=9,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-mka0m05y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsVi
aActuator-280856547-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:10:52Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=981e6db3-c4e9-422b-91bb-a2c1c5869fc4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.171 183407 DEBUG nova.network.os_vif_util [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.171 183407 DEBUG nova.network.os_vif_util [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:ac:22,bridge_name='br-int',has_traffic_filtering=True,id=b50fc69b-cfde-429d-908f-cde6f56037bf,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb50fc69b-cf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.172 183407 DEBUG nova.objects.instance [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lazy-loading 'pci_devices' on Instance uuid 981e6db3-c4e9-422b-91bb-a2c1c5869fc4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.653 183407 WARNING neutronclient.v2_0.client [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.684 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <uuid>981e6db3-c4e9-422b-91bb-a2c1c5869fc4</uuid>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <name>instance-00000009</name>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-855077368</nova:name>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:10:57</nova:creationTime>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:10:57 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:10:57 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:user uuid="afb4f4811cb043dca89a8413c390ba3d">tempest-TestExecuteActionsViaActuator-280856547-project-admin</nova:user>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:project uuid="6377892a338d4a7cbe63cf30bd2c63ea">tempest-TestExecuteActionsViaActuator-280856547</nova:project>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         <nova:port uuid="b50fc69b-cfde-429d-908f-cde6f56037bf">
Jan 26 15:10:57 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <system>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <entry name="serial">981e6db3-c4e9-422b-91bb-a2c1c5869fc4</entry>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <entry name="uuid">981e6db3-c4e9-422b-91bb-a2c1c5869fc4</entry>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     </system>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <os>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   </os>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <features>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   </features>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk.config"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:7f:ac:22"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <target dev="tapb50fc69b-cf"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/console.log" append="off"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <video>
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     </video>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:10:57 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:10:57 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:10:57 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:10:57 compute-1 nova_compute[183403]: </domain>
Jan 26 15:10:57 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.685 183407 DEBUG nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Preparing to wait for external event network-vif-plugged-b50fc69b-cfde-429d-908f-cde6f56037bf prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.686 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.686 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.687 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.688 183407 DEBUG nova.virt.libvirt.vif [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:10:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-855077368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-855077368',id=9,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-mka0m05y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:10:52Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=981e6db3-c4e9-422b-91bb-a2c1c5869fc4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.689 183407 DEBUG nova.network.os_vif_util [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.690 183407 DEBUG nova.network.os_vif_util [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:ac:22,bridge_name='br-int',has_traffic_filtering=True,id=b50fc69b-cfde-429d-908f-cde6f56037bf,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb50fc69b-cf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.690 183407 DEBUG os_vif [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:ac:22,bridge_name='br-int',has_traffic_filtering=True,id=b50fc69b-cfde-429d-908f-cde6f56037bf,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb50fc69b-cf') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.691 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.692 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.692 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.694 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.694 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd5ad23c9-f44b-584f-af8c-a943561f1766', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.696 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.699 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.703 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.704 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb50fc69b-cf, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.704 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb50fc69b-cf, col_values=(('qos', UUID('fb96a913-e619-49d9-8787-9a482bf4ae58')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.705 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb50fc69b-cf, col_values=(('external_ids', {'iface-id': 'b50fc69b-cfde-429d-908f-cde6f56037bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:ac:22', 'vm-uuid': '981e6db3-c4e9-422b-91bb-a2c1c5869fc4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.707 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:57 compute-1 NetworkManager[55716]: <info>  [1769440257.7080] manager: (tapb50fc69b-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.710 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.715 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:57 compute-1 nova_compute[183403]: 2026-01-26 15:10:57.716 183407 INFO os_vif [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:ac:22,bridge_name='br-int',has_traffic_filtering=True,id=b50fc69b-cfde-429d-908f-cde6f56037bf,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb50fc69b-cf')
Jan 26 15:10:58 compute-1 nova_compute[183403]: 2026-01-26 15:10:58.407 183407 WARNING neutronclient.v2_0.client [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:10:58 compute-1 nova_compute[183403]: 2026-01-26 15:10:58.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:10:58 compute-1 nova_compute[183403]: 2026-01-26 15:10:58.601 183407 DEBUG nova.network.neutron [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Updated VIF entry in instance network info cache for port b50fc69b-cfde-429d-908f-cde6f56037bf. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 15:10:58 compute-1 nova_compute[183403]: 2026-01-26 15:10:58.602 183407 DEBUG nova.network.neutron [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Updating instance_info_cache with network_info: [{"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:10:59 compute-1 nova_compute[183403]: 2026-01-26 15:10:59.109 183407 DEBUG oslo_concurrency.lockutils [req-9fc6b6a5-2d7d-4565-9ce0-51a6322d1bc6 req-36dc021f-b1b5-4d49-b0ee-f7e58b4c3e2e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-981e6db3-c4e9-422b-91bb-a2c1c5869fc4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:10:59 compute-1 nova_compute[183403]: 2026-01-26 15:10:59.233 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:10:59 compute-1 nova_compute[183403]: 2026-01-26 15:10:59.322 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:10:59 compute-1 nova_compute[183403]: 2026-01-26 15:10:59.322 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:10:59 compute-1 nova_compute[183403]: 2026-01-26 15:10:59.322 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] No VIF found with MAC fa:16:3e:7f:ac:22, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:10:59 compute-1 nova_compute[183403]: 2026-01-26 15:10:59.323 183407 INFO nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Using config drive
Jan 26 15:10:59 compute-1 nova_compute[183403]: 2026-01-26 15:10:59.833 183407 WARNING neutronclient.v2_0.client [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:00 compute-1 nova_compute[183403]: 2026-01-26 15:11:00.172 183407 INFO nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Creating config drive at /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk.config
Jan 26 15:11:00 compute-1 nova_compute[183403]: 2026-01-26 15:11:00.178 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp_y9z6uqv execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:00 compute-1 nova_compute[183403]: 2026-01-26 15:11:00.323 183407 DEBUG oslo_concurrency.processutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp_y9z6uqv" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:00 compute-1 kernel: tapb50fc69b-cf: entered promiscuous mode
Jan 26 15:11:00 compute-1 NetworkManager[55716]: <info>  [1769440260.4194] manager: (tapb50fc69b-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Jan 26 15:11:00 compute-1 ovn_controller[95641]: 2026-01-26T15:11:00Z|00062|binding|INFO|Claiming lport b50fc69b-cfde-429d-908f-cde6f56037bf for this chassis.
Jan 26 15:11:00 compute-1 ovn_controller[95641]: 2026-01-26T15:11:00Z|00063|binding|INFO|b50fc69b-cfde-429d-908f-cde6f56037bf: Claiming fa:16:3e:7f:ac:22 10.100.0.10
Jan 26 15:11:00 compute-1 nova_compute[183403]: 2026-01-26 15:11:00.427 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.432 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:ac:22 10.100.0.10'], port_security=['fa:16:3e:7f:ac:22 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '981e6db3-c4e9-422b-91bb-a2c1c5869fc4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=b50fc69b-cfde-429d-908f-cde6f56037bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.433 104930 INFO neutron.agent.ovn.metadata.agent [-] Port b50fc69b-cfde-429d-908f-cde6f56037bf in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf bound to our chassis
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.435 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:11:00 compute-1 ovn_controller[95641]: 2026-01-26T15:11:00Z|00064|binding|INFO|Setting lport b50fc69b-cfde-429d-908f-cde6f56037bf ovn-installed in OVS
Jan 26 15:11:00 compute-1 ovn_controller[95641]: 2026-01-26T15:11:00Z|00065|binding|INFO|Setting lport b50fc69b-cfde-429d-908f-cde6f56037bf up in Southbound
Jan 26 15:11:00 compute-1 nova_compute[183403]: 2026-01-26 15:11:00.445 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.451 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c503b124-5d72-4ad0-b58b-6a9700654385]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:00 compute-1 nova_compute[183403]: 2026-01-26 15:11:00.452 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:00 compute-1 systemd-machined[154697]: New machine qemu-5-instance-00000009.
Jan 26 15:11:00 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Jan 26 15:11:00 compute-1 systemd-udevd[206141]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.486 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d34d34-5c6e-4686-a4d7-30abb1b0e010]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.491 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e1166a-97fe-4e17-8f92-9936fad60469]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:00 compute-1 podman[206097]: 2026-01-26 15:11:00.491949591 +0000 UTC m=+0.082038881 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 15:11:00 compute-1 NetworkManager[55716]: <info>  [1769440260.5001] device (tapb50fc69b-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:11:00 compute-1 NetworkManager[55716]: <info>  [1769440260.5008] device (tapb50fc69b-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:11:00 compute-1 nova_compute[183403]: 2026-01-26 15:11:00.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.518 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[ae660d79-ab78-4127-aedb-b5936c40605e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:00 compute-1 podman[206096]: 2026-01-26 15:11:00.624115258 +0000 UTC m=+0.218491346 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.625 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8780a0df-2ce5-495a-9d40-b2e753ae4991]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206152, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.640 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3b66fbd6-314d-470f-994d-79c82e9f7276]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206159, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206159, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.642 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:00 compute-1 nova_compute[183403]: 2026-01-26 15:11:00.645 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.646 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.646 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.647 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.647 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:11:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:00.649 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[363447fa-a8bb-4bd5-a024-23a3862de05b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.125 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.126 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.126 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.126 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.213 183407 DEBUG nova.compute.manager [req-ce8ea22e-c343-4ad5-aa70-adcad4d27113 req-c41e1de2-2cc8-4257-9853-8cc8e2324d0c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Received event network-vif-plugged-b50fc69b-cfde-429d-908f-cde6f56037bf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.213 183407 DEBUG oslo_concurrency.lockutils [req-ce8ea22e-c343-4ad5-aa70-adcad4d27113 req-c41e1de2-2cc8-4257-9853-8cc8e2324d0c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.214 183407 DEBUG oslo_concurrency.lockutils [req-ce8ea22e-c343-4ad5-aa70-adcad4d27113 req-c41e1de2-2cc8-4257-9853-8cc8e2324d0c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.214 183407 DEBUG oslo_concurrency.lockutils [req-ce8ea22e-c343-4ad5-aa70-adcad4d27113 req-c41e1de2-2cc8-4257-9853-8cc8e2324d0c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.214 183407 DEBUG nova.compute.manager [req-ce8ea22e-c343-4ad5-aa70-adcad4d27113 req-c41e1de2-2cc8-4257-9853-8cc8e2324d0c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Processing event network-vif-plugged-b50fc69b-cfde-429d-908f-cde6f56037bf _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.215 183407 DEBUG nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.234 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.238 183407 INFO nova.virt.libvirt.driver [-] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Instance spawned successfully.
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.238 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.754 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.755 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.755 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.756 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.756 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:11:01 compute-1 nova_compute[183403]: 2026-01-26 15:11:01.757 183407 DEBUG nova.virt.libvirt.driver [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.171 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.226 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.227 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.266 183407 INFO nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Took 8.40 seconds to spawn the instance on the hypervisor.
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.268 183407 DEBUG nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.287 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.291 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.344 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.345 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.399 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.403 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.544 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json" returned: 0 in 0.141s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.545 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.607 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.614 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.677 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.679 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.708 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.760 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.960 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.961 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.984 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.985 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5324MB free_disk=73.06208801269531GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.985 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:02 compute-1 nova_compute[183403]: 2026-01-26 15:11:02.985 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:03 compute-1 nova_compute[183403]: 2026-01-26 15:11:03.058 183407 INFO nova.compute.manager [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Took 14.44 seconds to build instance.
Jan 26 15:11:03 compute-1 nova_compute[183403]: 2026-01-26 15:11:03.323 183407 DEBUG nova.compute.manager [req-329c1843-149c-4602-82f6-66942cdf1a36 req-f5737350-abab-494a-a7e5-3b52d37439fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Received event network-vif-plugged-b50fc69b-cfde-429d-908f-cde6f56037bf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:11:03 compute-1 nova_compute[183403]: 2026-01-26 15:11:03.324 183407 DEBUG oslo_concurrency.lockutils [req-329c1843-149c-4602-82f6-66942cdf1a36 req-f5737350-abab-494a-a7e5-3b52d37439fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:03 compute-1 nova_compute[183403]: 2026-01-26 15:11:03.324 183407 DEBUG oslo_concurrency.lockutils [req-329c1843-149c-4602-82f6-66942cdf1a36 req-f5737350-abab-494a-a7e5-3b52d37439fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:03 compute-1 nova_compute[183403]: 2026-01-26 15:11:03.324 183407 DEBUG oslo_concurrency.lockutils [req-329c1843-149c-4602-82f6-66942cdf1a36 req-f5737350-abab-494a-a7e5-3b52d37439fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:03 compute-1 nova_compute[183403]: 2026-01-26 15:11:03.325 183407 DEBUG nova.compute.manager [req-329c1843-149c-4602-82f6-66942cdf1a36 req-f5737350-abab-494a-a7e5-3b52d37439fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] No waiting events found dispatching network-vif-plugged-b50fc69b-cfde-429d-908f-cde6f56037bf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:11:03 compute-1 nova_compute[183403]: 2026-01-26 15:11:03.325 183407 WARNING nova.compute.manager [req-329c1843-149c-4602-82f6-66942cdf1a36 req-f5737350-abab-494a-a7e5-3b52d37439fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Received unexpected event network-vif-plugged-b50fc69b-cfde-429d-908f-cde6f56037bf for instance with vm_state active and task_state None.
Jan 26 15:11:03 compute-1 nova_compute[183403]: 2026-01-26 15:11:03.565 183407 DEBUG oslo_concurrency.lockutils [None req-ed8a9406-f42b-4145-8c5b-d597c861f7ee afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.964s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:04 compute-1 nova_compute[183403]: 2026-01-26 15:11:04.092 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 66a7af21-1abe-467f-b739-441e05a4b09a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:11:04 compute-1 nova_compute[183403]: 2026-01-26 15:11:04.093 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 8c64a2e0-f723-4adb-84fc-867073a92349 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:11:04 compute-1 nova_compute[183403]: 2026-01-26 15:11:04.094 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance aab8c28e-0489-40bd-88cf-5eb7c419933a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:11:04 compute-1 nova_compute[183403]: 2026-01-26 15:11:04.094 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 981e6db3-c4e9-422b-91bb-a2c1c5869fc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:11:04 compute-1 nova_compute[183403]: 2026-01-26 15:11:04.095 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:11:04 compute-1 nova_compute[183403]: 2026-01-26 15:11:04.095 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1088MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:11:02 up  1:06,  0 user,  load average: 0.29, 0.22, 0.36\n', 'num_instances': '4', 'num_vm_active': '4', 'num_task_None': '4', 'num_os_type_None': '4', 'num_proj_6377892a338d4a7cbe63cf30bd2c63ea': '4', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:11:04 compute-1 nova_compute[183403]: 2026-01-26 15:11:04.168 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:11:04 compute-1 nova_compute[183403]: 2026-01-26 15:11:04.286 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:04 compute-1 nova_compute[183403]: 2026-01-26 15:11:04.894 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:11:05 compute-1 podman[192725]: time="2026-01-26T15:11:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:11:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:11:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:11:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:11:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2643 "" "Go-http-client/1.1"
Jan 26 15:11:06 compute-1 nova_compute[183403]: 2026-01-26 15:11:06.138 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:11:06 compute-1 nova_compute[183403]: 2026-01-26 15:11:06.139 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.154s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:06 compute-1 nova_compute[183403]: 2026-01-26 15:11:06.139 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:06 compute-1 nova_compute[183403]: 2026-01-26 15:11:06.140 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:11:06 compute-1 nova_compute[183403]: 2026-01-26 15:11:06.646 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:11:07 compute-1 nova_compute[183403]: 2026-01-26 15:11:07.647 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:07 compute-1 nova_compute[183403]: 2026-01-26 15:11:07.648 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:07 compute-1 nova_compute[183403]: 2026-01-26 15:11:07.648 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:07 compute-1 nova_compute[183403]: 2026-01-26 15:11:07.648 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:07 compute-1 nova_compute[183403]: 2026-01-26 15:11:07.649 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:07 compute-1 nova_compute[183403]: 2026-01-26 15:11:07.649 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:11:07 compute-1 nova_compute[183403]: 2026-01-26 15:11:07.710 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:08 compute-1 nova_compute[183403]: 2026-01-26 15:11:08.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:08 compute-1 nova_compute[183403]: 2026-01-26 15:11:08.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:08 compute-1 nova_compute[183403]: 2026-01-26 15:11:08.576 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:11:09 compute-1 nova_compute[183403]: 2026-01-26 15:11:09.324 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:09 compute-1 sshd-session[206084]: Invalid user oracle from 185.246.128.170 port 59750
Jan 26 15:11:12 compute-1 sshd-session[206084]: Disconnecting invalid user oracle 185.246.128.170 port 59750: Change of username or service not allowed: (oracle,ssh-connection) -> (1,ssh-connection) [preauth]
Jan 26 15:11:12 compute-1 nova_compute[183403]: 2026-01-26 15:11:12.413 183407 DEBUG nova.compute.manager [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Jan 26 15:11:12 compute-1 nova_compute[183403]: 2026-01-26 15:11:12.712 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:12 compute-1 nova_compute[183403]: 2026-01-26 15:11:12.955 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:12 compute-1 nova_compute[183403]: 2026-01-26 15:11:12.955 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:13 compute-1 nova_compute[183403]: 2026-01-26 15:11:13.476 183407 DEBUG nova.objects.instance [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'pci_requests' on Instance uuid 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:11:13 compute-1 nova_compute[183403]: 2026-01-26 15:11:13.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:11:13 compute-1 nova_compute[183403]: 2026-01-26 15:11:13.759 183407 DEBUG nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Creating tmpfile /var/lib/nova/instances/tmpw5eyqmrr to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:11:13 compute-1 nova_compute[183403]: 2026-01-26 15:11:13.760 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:13 compute-1 nova_compute[183403]: 2026-01-26 15:11:13.879 183407 DEBUG nova.compute.manager [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw5eyqmrr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:11:13 compute-1 nova_compute[183403]: 2026-01-26 15:11:13.988 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:11:13 compute-1 nova_compute[183403]: 2026-01-26 15:11:13.989 183407 INFO nova.compute.claims [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:11:13 compute-1 nova_compute[183403]: 2026-01-26 15:11:13.989 183407 DEBUG nova.objects.instance [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'resources' on Instance uuid 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:11:14 compute-1 nova_compute[183403]: 2026-01-26 15:11:14.361 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:14 compute-1 nova_compute[183403]: 2026-01-26 15:11:14.495 183407 DEBUG nova.objects.base [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<7484f2cd-93e8-4578-9c4a-5bc1e0b49d10> lazy-loaded attributes: pci_requests,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:11:14 compute-1 nova_compute[183403]: 2026-01-26 15:11:14.496 183407 DEBUG nova.objects.instance [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'numa_topology' on Instance uuid 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:11:15 compute-1 nova_compute[183403]: 2026-01-26 15:11:15.002 183407 DEBUG nova.objects.base [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<7484f2cd-93e8-4578-9c4a-5bc1e0b49d10> lazy-loaded attributes: pci_requests,resources,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:11:15 compute-1 nova_compute[183403]: 2026-01-26 15:11:15.003 183407 DEBUG nova.objects.instance [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'pci_devices' on Instance uuid 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:11:15 compute-1 ovn_controller[95641]: 2026-01-26T15:11:15Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:ac:22 10.100.0.10
Jan 26 15:11:15 compute-1 ovn_controller[95641]: 2026-01-26T15:11:15Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:ac:22 10.100.0.10
Jan 26 15:11:15 compute-1 nova_compute[183403]: 2026-01-26 15:11:15.513 183407 DEBUG nova.objects.base [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<7484f2cd-93e8-4578-9c4a-5bc1e0b49d10> lazy-loaded attributes: pci_requests,resources,numa_topology,pci_devices wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:11:15 compute-1 nova_compute[183403]: 2026-01-26 15:11:15.914 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:16 compute-1 nova_compute[183403]: 2026-01-26 15:11:16.028 183407 INFO nova.compute.resource_tracker [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Updating resource usage from migration ee4a7f7c-80ec-4fd1-aa47-e223dc5d8b77
Jan 26 15:11:16 compute-1 nova_compute[183403]: 2026-01-26 15:11:16.029 183407 DEBUG nova.compute.resource_tracker [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Starting to track incoming migration ee4a7f7c-80ec-4fd1-aa47-e223dc5d8b77 with flavor 74480e15-23e6-4569-8ef9-3ddf5ac8b981 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Jan 26 15:11:16 compute-1 nova_compute[183403]: 2026-01-26 15:11:16.650 183407 DEBUG nova.compute.provider_tree [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:11:17 compute-1 nova_compute[183403]: 2026-01-26 15:11:17.158 183407 DEBUG nova.scheduler.client.report [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:11:17 compute-1 nova_compute[183403]: 2026-01-26 15:11:17.668 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 4.712s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:17 compute-1 nova_compute[183403]: 2026-01-26 15:11:17.668 183407 INFO nova.compute.manager [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Migrating
Jan 26 15:11:17 compute-1 nova_compute[183403]: 2026-01-26 15:11:17.715 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:19 compute-1 nova_compute[183403]: 2026-01-26 15:11:19.410 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:19 compute-1 openstack_network_exporter[195610]: ERROR   15:11:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:11:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:11:19 compute-1 openstack_network_exporter[195610]: ERROR   15:11:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:11:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:11:19 compute-1 sshd-session[206218]: Invalid user 1 from 185.246.128.170 port 7983
Jan 26 15:11:19 compute-1 podman[206221]: 2026-01-26 15:11:19.926222183 +0000 UTC m=+0.094105158 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 15:11:19 compute-1 podman[206220]: 2026-01-26 15:11:19.944450566 +0000 UTC m=+0.108190238 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:11:20 compute-1 sshd-session[206218]: Disconnecting invalid user 1 185.246.128.170 port 7983: Change of username or service not allowed: (1,ssh-connection) -> (pwrchute,ssh-connection) [preauth]
Jan 26 15:11:20 compute-1 nova_compute[183403]: 2026-01-26 15:11:20.739 183407 DEBUG nova.compute.manager [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw5eyqmrr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6532f73e-4d35-42af-b257-7ddf3cd08929',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:11:21 compute-1 nova_compute[183403]: 2026-01-26 15:11:21.755 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-6532f73e-4d35-42af-b257-7ddf3cd08929" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:11:21 compute-1 nova_compute[183403]: 2026-01-26 15:11:21.756 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-6532f73e-4d35-42af-b257-7ddf3cd08929" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:11:21 compute-1 nova_compute[183403]: 2026-01-26 15:11:21.756 183407 DEBUG nova.network.neutron [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:11:22 compute-1 nova_compute[183403]: 2026-01-26 15:11:22.262 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:22 compute-1 nova_compute[183403]: 2026-01-26 15:11:22.752 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:22 compute-1 nova_compute[183403]: 2026-01-26 15:11:22.844 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:22 compute-1 sshd-session[206264]: Accepted publickey for nova from 192.168.122.100 port 43816 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:11:22 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Jan 26 15:11:22 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 26 15:11:22 compute-1 systemd-logind[795]: New session 34 of user nova.
Jan 26 15:11:22 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 26 15:11:22 compute-1 systemd[1]: Starting User Manager for UID 42436...
Jan 26 15:11:22 compute-1 systemd[206268]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:11:22 compute-1 nova_compute[183403]: 2026-01-26 15:11:22.972 183407 DEBUG nova.network.neutron [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Updating instance_info_cache with network_info: [{"id": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "address": "fa:16:3e:a4:ff:b6", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe6de1e-43", "ovs_interfaceid": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:11:23 compute-1 systemd[206268]: Queued start job for default target Main User Target.
Jan 26 15:11:23 compute-1 systemd[206268]: Created slice User Application Slice.
Jan 26 15:11:23 compute-1 systemd[206268]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 15:11:23 compute-1 systemd[206268]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 15:11:23 compute-1 systemd[206268]: Reached target Paths.
Jan 26 15:11:23 compute-1 systemd[206268]: Reached target Timers.
Jan 26 15:11:23 compute-1 systemd[206268]: Starting D-Bus User Message Bus Socket...
Jan 26 15:11:23 compute-1 systemd[206268]: Starting Create User's Volatile Files and Directories...
Jan 26 15:11:23 compute-1 systemd[206268]: Finished Create User's Volatile Files and Directories.
Jan 26 15:11:23 compute-1 systemd[206268]: Listening on D-Bus User Message Bus Socket.
Jan 26 15:11:23 compute-1 systemd[206268]: Reached target Sockets.
Jan 26 15:11:23 compute-1 systemd[206268]: Reached target Basic System.
Jan 26 15:11:23 compute-1 systemd[206268]: Reached target Main User Target.
Jan 26 15:11:23 compute-1 systemd[206268]: Startup finished in 176ms.
Jan 26 15:11:23 compute-1 systemd[1]: Started User Manager for UID 42436.
Jan 26 15:11:23 compute-1 systemd[1]: Started Session 34 of User nova.
Jan 26 15:11:23 compute-1 sshd-session[206264]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:11:23 compute-1 sshd-session[206283]: Received disconnect from 192.168.122.100 port 43816:11: disconnected by user
Jan 26 15:11:23 compute-1 sshd-session[206283]: Disconnected from user nova 192.168.122.100 port 43816
Jan 26 15:11:23 compute-1 sshd-session[206264]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:11:23 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Jan 26 15:11:23 compute-1 systemd-logind[795]: Session 34 logged out. Waiting for processes to exit.
Jan 26 15:11:23 compute-1 systemd-logind[795]: Removed session 34.
Jan 26 15:11:23 compute-1 sshd-session[206285]: Accepted publickey for nova from 192.168.122.100 port 43828 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:11:23 compute-1 systemd-logind[795]: New session 36 of user nova.
Jan 26 15:11:23 compute-1 systemd[1]: Started Session 36 of User nova.
Jan 26 15:11:23 compute-1 sshd-session[206285]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:11:23 compute-1 sshd-session[206288]: Received disconnect from 192.168.122.100 port 43828:11: disconnected by user
Jan 26 15:11:23 compute-1 sshd-session[206288]: Disconnected from user nova 192.168.122.100 port 43828
Jan 26 15:11:23 compute-1 sshd-session[206285]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:11:23 compute-1 systemd[1]: session-36.scope: Deactivated successfully.
Jan 26 15:11:23 compute-1 systemd-logind[795]: Session 36 logged out. Waiting for processes to exit.
Jan 26 15:11:23 compute-1 systemd-logind[795]: Removed session 36.
Jan 26 15:11:23 compute-1 nova_compute[183403]: 2026-01-26 15:11:23.663 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-6532f73e-4d35-42af-b257-7ddf3cd08929" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:11:23 compute-1 nova_compute[183403]: 2026-01-26 15:11:23.677 183407 DEBUG nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw5eyqmrr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6532f73e-4d35-42af-b257-7ddf3cd08929',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:11:23 compute-1 nova_compute[183403]: 2026-01-26 15:11:23.678 183407 DEBUG nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Creating instance directory: /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:11:23 compute-1 nova_compute[183403]: 2026-01-26 15:11:23.678 183407 DEBUG nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Creating disk.info with the contents: {'/var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk': 'qcow2', '/var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:11:23 compute-1 nova_compute[183403]: 2026-01-26 15:11:23.679 183407 DEBUG nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:11:23 compute-1 nova_compute[183403]: 2026-01-26 15:11:23.679 183407 DEBUG nova.objects.instance [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6532f73e-4d35-42af-b257-7ddf3cd08929 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.187 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.193 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.196 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.257 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.259 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.260 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.260 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.267 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.267 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.327 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.328 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.399 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk 1073741824" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.400 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.401 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.413 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.470 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.470 183407 DEBUG nova.virt.disk.api [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.471 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.546 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.547 183407 DEBUG nova.virt.disk.api [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:11:24 compute-1 nova_compute[183403]: 2026-01-26 15:11:24.548 183407 DEBUG nova.objects.instance [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 6532f73e-4d35-42af-b257-7ddf3cd08929 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.058 183407 DEBUG nova.objects.base [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<6532f73e-4d35-42af-b257-7ddf3cd08929> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.058 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.088 183407 DEBUG oslo_concurrency.processutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk.config 497664" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.089 183407 DEBUG nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.090 183407 DEBUG nova.virt.libvirt.vif [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:09:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-367519052',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-367519052',id=6,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:09:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-jy1lc3dj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:09:52Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=6532f73e-4d35-42af-b257-7ddf3cd08929,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "address": "fa:16:3e:a4:ff:b6", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2fe6de1e-43", "ovs_interfaceid": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.091 183407 DEBUG nova.network.os_vif_util [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "address": "fa:16:3e:a4:ff:b6", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2fe6de1e-43", "ovs_interfaceid": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.092 183407 DEBUG nova.network.os_vif_util [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ff:b6,bridge_name='br-int',has_traffic_filtering=True,id=2fe6de1e-4369-4693-b470-ed7d9e9f51ef,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fe6de1e-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.092 183407 DEBUG os_vif [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ff:b6,bridge_name='br-int',has_traffic_filtering=True,id=2fe6de1e-4369-4693-b470-ed7d9e9f51ef,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fe6de1e-43') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.093 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.093 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.094 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.094 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.095 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f322d831-938e-5fde-b570-251d54f7cccd', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.096 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.099 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.103 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.103 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fe6de1e-43, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.103 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2fe6de1e-43, col_values=(('qos', UUID('3320c8e3-eb3c-4a8b-b1b8-8812d8af6275')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.104 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2fe6de1e-43, col_values=(('external_ids', {'iface-id': '2fe6de1e-4369-4693-b470-ed7d9e9f51ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:ff:b6', 'vm-uuid': '6532f73e-4d35-42af-b257-7ddf3cd08929'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.105 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:25 compute-1 NetworkManager[55716]: <info>  [1769440285.1065] manager: (tap2fe6de1e-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.107 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.112 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.113 183407 INFO os_vif [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ff:b6,bridge_name='br-int',has_traffic_filtering=True,id=2fe6de1e-4369-4693-b470-ed7d9e9f51ef,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fe6de1e-43')
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.113 183407 DEBUG nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.113 183407 DEBUG nova.compute.manager [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw5eyqmrr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6532f73e-4d35-42af-b257-7ddf3cd08929',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.114 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:25 compute-1 nova_compute[183403]: 2026-01-26 15:11:25.960 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:26 compute-1 sshd-session[206311]: Accepted publickey for nova from 192.168.122.100 port 43834 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:11:26 compute-1 systemd-logind[795]: New session 37 of user nova.
Jan 26 15:11:26 compute-1 systemd[1]: Started Session 37 of User nova.
Jan 26 15:11:26 compute-1 sshd-session[206311]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:11:26 compute-1 nova_compute[183403]: 2026-01-26 15:11:26.896 183407 DEBUG nova.compute.manager [req-7dabeb4b-7133-415e-95b6-c985f561dd91 req-70226303-4b81-47f6-a016-85ec3e20455e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:11:26 compute-1 nova_compute[183403]: 2026-01-26 15:11:26.897 183407 DEBUG oslo_concurrency.lockutils [req-7dabeb4b-7133-415e-95b6-c985f561dd91 req-70226303-4b81-47f6-a016-85ec3e20455e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:26 compute-1 nova_compute[183403]: 2026-01-26 15:11:26.897 183407 DEBUG oslo_concurrency.lockutils [req-7dabeb4b-7133-415e-95b6-c985f561dd91 req-70226303-4b81-47f6-a016-85ec3e20455e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:26 compute-1 nova_compute[183403]: 2026-01-26 15:11:26.897 183407 DEBUG oslo_concurrency.lockutils [req-7dabeb4b-7133-415e-95b6-c985f561dd91 req-70226303-4b81-47f6-a016-85ec3e20455e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:26 compute-1 nova_compute[183403]: 2026-01-26 15:11:26.897 183407 DEBUG nova.compute.manager [req-7dabeb4b-7133-415e-95b6-c985f561dd91 req-70226303-4b81-47f6-a016-85ec3e20455e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] No waiting events found dispatching network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:11:26 compute-1 nova_compute[183403]: 2026-01-26 15:11:26.898 183407 WARNING nova.compute.manager [req-7dabeb4b-7133-415e-95b6-c985f561dd91 req-70226303-4b81-47f6-a016-85ec3e20455e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received unexpected event network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 for instance with vm_state active and task_state resize_migrating.
Jan 26 15:11:27 compute-1 sshd-session[206314]: Received disconnect from 192.168.122.100 port 43834:11: disconnected by user
Jan 26 15:11:27 compute-1 sshd-session[206314]: Disconnected from user nova 192.168.122.100 port 43834
Jan 26 15:11:27 compute-1 sshd-session[206311]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:11:27 compute-1 systemd[1]: session-37.scope: Deactivated successfully.
Jan 26 15:11:27 compute-1 systemd-logind[795]: Session 37 logged out. Waiting for processes to exit.
Jan 26 15:11:27 compute-1 systemd-logind[795]: Removed session 37.
Jan 26 15:11:27 compute-1 sshd-session[206316]: Accepted publickey for nova from 192.168.122.100 port 49102 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:11:27 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:27.297 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:11:27 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:27.298 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:11:27 compute-1 nova_compute[183403]: 2026-01-26 15:11:27.334 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:27 compute-1 systemd-logind[795]: New session 38 of user nova.
Jan 26 15:11:27 compute-1 systemd[1]: Started Session 38 of User nova.
Jan 26 15:11:27 compute-1 sshd-session[206316]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:11:27 compute-1 sshd-session[206320]: Received disconnect from 192.168.122.100 port 49102:11: disconnected by user
Jan 26 15:11:27 compute-1 sshd-session[206320]: Disconnected from user nova 192.168.122.100 port 49102
Jan 26 15:11:27 compute-1 sshd-session[206316]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:11:27 compute-1 systemd-logind[795]: Session 38 logged out. Waiting for processes to exit.
Jan 26 15:11:27 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Jan 26 15:11:27 compute-1 systemd-logind[795]: Removed session 38.
Jan 26 15:11:27 compute-1 sshd-session[206322]: Accepted publickey for nova from 192.168.122.100 port 49108 ssh2: ECDSA SHA256:T5sQaZjFwfSMOu21b+lyPX98YYE4+kdOeN+QFPxwhQE
Jan 26 15:11:27 compute-1 systemd-logind[795]: New session 39 of user nova.
Jan 26 15:11:27 compute-1 systemd[1]: Started Session 39 of User nova.
Jan 26 15:11:27 compute-1 sshd-session[206322]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 26 15:11:27 compute-1 sshd-session[206325]: Received disconnect from 192.168.122.100 port 49108:11: disconnected by user
Jan 26 15:11:27 compute-1 sshd-session[206325]: Disconnected from user nova 192.168.122.100 port 49108
Jan 26 15:11:27 compute-1 sshd-session[206322]: pam_unix(sshd:session): session closed for user nova
Jan 26 15:11:27 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Jan 26 15:11:27 compute-1 systemd-logind[795]: Session 39 logged out. Waiting for processes to exit.
Jan 26 15:11:27 compute-1 systemd-logind[795]: Removed session 39.
Jan 26 15:11:28 compute-1 nova_compute[183403]: 2026-01-26 15:11:28.981 183407 DEBUG nova.compute.manager [req-f02a12e6-7a63-4520-a926-e1391a816a8e req-e60a169a-a3cb-4be3-a2cb-1ecd8e1797c6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:11:28 compute-1 nova_compute[183403]: 2026-01-26 15:11:28.982 183407 DEBUG oslo_concurrency.lockutils [req-f02a12e6-7a63-4520-a926-e1391a816a8e req-e60a169a-a3cb-4be3-a2cb-1ecd8e1797c6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:28 compute-1 nova_compute[183403]: 2026-01-26 15:11:28.982 183407 DEBUG oslo_concurrency.lockutils [req-f02a12e6-7a63-4520-a926-e1391a816a8e req-e60a169a-a3cb-4be3-a2cb-1ecd8e1797c6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:28 compute-1 nova_compute[183403]: 2026-01-26 15:11:28.982 183407 DEBUG oslo_concurrency.lockutils [req-f02a12e6-7a63-4520-a926-e1391a816a8e req-e60a169a-a3cb-4be3-a2cb-1ecd8e1797c6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:28 compute-1 nova_compute[183403]: 2026-01-26 15:11:28.982 183407 DEBUG nova.compute.manager [req-f02a12e6-7a63-4520-a926-e1391a816a8e req-e60a169a-a3cb-4be3-a2cb-1ecd8e1797c6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] No waiting events found dispatching network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:11:28 compute-1 nova_compute[183403]: 2026-01-26 15:11:28.982 183407 WARNING nova.compute.manager [req-f02a12e6-7a63-4520-a926-e1391a816a8e req-e60a169a-a3cb-4be3-a2cb-1ecd8e1797c6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received unexpected event network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 for instance with vm_state active and task_state resize_migrated.
Jan 26 15:11:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:29.035 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:29.035 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:29.035 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:29 compute-1 nova_compute[183403]: 2026-01-26 15:11:29.122 183407 DEBUG nova.network.neutron [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Port 2fe6de1e-4369-4693-b470-ed7d9e9f51ef updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:11:29 compute-1 nova_compute[183403]: 2026-01-26 15:11:29.235 183407 DEBUG nova.compute.manager [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw5eyqmrr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6532f73e-4d35-42af-b257-7ddf3cd08929',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:11:29 compute-1 nova_compute[183403]: 2026-01-26 15:11:29.415 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:30 compute-1 nova_compute[183403]: 2026-01-26 15:11:30.106 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:30 compute-1 sshd-session[206263]: Invalid user pwrchute from 185.246.128.170 port 12714
Jan 26 15:11:30 compute-1 podman[206344]: 2026-01-26 15:11:30.920075638 +0000 UTC m=+0.087653148 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:11:30 compute-1 podman[206343]: 2026-01-26 15:11:30.964532832 +0000 UTC m=+0.133091886 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 15:11:31 compute-1 nova_compute[183403]: 2026-01-26 15:11:31.361 183407 WARNING neutronclient.v2_0.client [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:31 compute-1 nova_compute[183403]: 2026-01-26 15:11:31.696 183407 INFO nova.network.neutron [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Updating port 2282f681-f2e6-4601-a493-3244e752f7a4 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.299 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:33 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 15:11:33 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 15:11:33 compute-1 sshd-session[206263]: Disconnecting invalid user pwrchute 185.246.128.170 port 12714: Change of username or service not allowed: (pwrchute,ssh-connection) -> (github,ssh-connection) [preauth]
Jan 26 15:11:33 compute-1 kernel: tap2fe6de1e-43: entered promiscuous mode
Jan 26 15:11:33 compute-1 NetworkManager[55716]: <info>  [1769440293.6710] manager: (tap2fe6de1e-43): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Jan 26 15:11:33 compute-1 nova_compute[183403]: 2026-01-26 15:11:33.672 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:33 compute-1 ovn_controller[95641]: 2026-01-26T15:11:33Z|00066|binding|INFO|Claiming lport 2fe6de1e-4369-4693-b470-ed7d9e9f51ef for this additional chassis.
Jan 26 15:11:33 compute-1 ovn_controller[95641]: 2026-01-26T15:11:33Z|00067|binding|INFO|2fe6de1e-4369-4693-b470-ed7d9e9f51ef: Claiming fa:16:3e:a4:ff:b6 10.100.0.7
Jan 26 15:11:33 compute-1 nova_compute[183403]: 2026-01-26 15:11:33.676 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:33 compute-1 ovn_controller[95641]: 2026-01-26T15:11:33Z|00068|binding|INFO|Setting lport 2fe6de1e-4369-4693-b470-ed7d9e9f51ef ovn-installed in OVS
Jan 26 15:11:33 compute-1 nova_compute[183403]: 2026-01-26 15:11:33.688 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:33 compute-1 nova_compute[183403]: 2026-01-26 15:11:33.690 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:33 compute-1 systemd-machined[154697]: New machine qemu-6-instance-00000006.
Jan 26 15:11:33 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Jan 26 15:11:33 compute-1 systemd-udevd[206421]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:11:33 compute-1 NetworkManager[55716]: <info>  [1769440293.7615] device (tap2fe6de1e-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:11:33 compute-1 NetworkManager[55716]: <info>  [1769440293.7621] device (tap2fe6de1e-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.825 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ff:b6 10.100.0.7'], port_security=['fa:16:3e:a4:ff:b6 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6532f73e-4d35-42af-b257-7ddf3cd08929', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '10', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[], logical_port=2fe6de1e-4369-4693-b470-ed7d9e9f51ef) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.826 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 2fe6de1e-4369-4693-b470-ed7d9e9f51ef in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf unbound from our chassis
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.827 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.848 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ba78a2bd-6fc6-473a-8fa3-809df9f582cc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.886 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[a46edbc8-a0f0-41e2-beb4-8b0d61292153]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.889 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[626057bc-a420-482b-b155-82521db15695]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.926 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[566c83cf-2563-4e07-b4a8-e42be293c554]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.946 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[fea42a87-0466-4fe1-a5b6-bd2081378e59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 11, 'rx_bytes': 868, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 11, 'rx_bytes': 868, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206435, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.965 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f7334432-bbf0-4c49-8c3b-53c612b33be1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206436, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206436, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.967 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:33 compute-1 nova_compute[183403]: 2026-01-26 15:11:33.969 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.970 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.970 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.971 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.971 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:11:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:33.972 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4a25f209-4db2-415b-8862-3b6c3d8a92e6]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:34 compute-1 nova_compute[183403]: 2026-01-26 15:11:34.419 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:35 compute-1 nova_compute[183403]: 2026-01-26 15:11:35.107 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:35 compute-1 podman[192725]: time="2026-01-26T15:11:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:11:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:11:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:11:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:11:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Jan 26 15:11:36 compute-1 nova_compute[183403]: 2026-01-26 15:11:36.556 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-7484f2cd-93e8-4578-9c4a-5bc1e0b49d10" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:11:36 compute-1 nova_compute[183403]: 2026-01-26 15:11:36.557 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-7484f2cd-93e8-4578-9c4a-5bc1e0b49d10" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:11:36 compute-1 nova_compute[183403]: 2026-01-26 15:11:36.557 183407 DEBUG nova.network.neutron [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:11:37 compute-1 nova_compute[183403]: 2026-01-26 15:11:37.193 183407 DEBUG nova.compute.manager [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-changed-2282f681-f2e6-4601-a493-3244e752f7a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:11:37 compute-1 nova_compute[183403]: 2026-01-26 15:11:37.194 183407 DEBUG nova.compute.manager [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Refreshing instance network info cache due to event network-changed-2282f681-f2e6-4601-a493-3244e752f7a4. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:11:37 compute-1 nova_compute[183403]: 2026-01-26 15:11:37.194 183407 DEBUG oslo_concurrency.lockutils [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-7484f2cd-93e8-4578-9c4a-5bc1e0b49d10" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:11:37 compute-1 ovn_controller[95641]: 2026-01-26T15:11:37Z|00069|binding|INFO|Claiming lport 2fe6de1e-4369-4693-b470-ed7d9e9f51ef for this chassis.
Jan 26 15:11:37 compute-1 ovn_controller[95641]: 2026-01-26T15:11:37Z|00070|binding|INFO|2fe6de1e-4369-4693-b470-ed7d9e9f51ef: Claiming fa:16:3e:a4:ff:b6 10.100.0.7
Jan 26 15:11:37 compute-1 ovn_controller[95641]: 2026-01-26T15:11:37Z|00071|binding|INFO|Setting lport 2fe6de1e-4369-4693-b470-ed7d9e9f51ef up in Southbound
Jan 26 15:11:37 compute-1 nova_compute[183403]: 2026-01-26 15:11:37.683 183407 WARNING neutronclient.v2_0.client [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:37 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Jan 26 15:11:37 compute-1 systemd[206268]: Activating special unit Exit the Session...
Jan 26 15:11:37 compute-1 systemd[206268]: Stopped target Main User Target.
Jan 26 15:11:37 compute-1 systemd[206268]: Stopped target Basic System.
Jan 26 15:11:37 compute-1 systemd[206268]: Stopped target Paths.
Jan 26 15:11:37 compute-1 systemd[206268]: Stopped target Sockets.
Jan 26 15:11:37 compute-1 systemd[206268]: Stopped target Timers.
Jan 26 15:11:37 compute-1 systemd[206268]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 26 15:11:37 compute-1 systemd[206268]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 15:11:37 compute-1 systemd[206268]: Closed D-Bus User Message Bus Socket.
Jan 26 15:11:37 compute-1 systemd[206268]: Stopped Create User's Volatile Files and Directories.
Jan 26 15:11:37 compute-1 systemd[206268]: Removed slice User Application Slice.
Jan 26 15:11:37 compute-1 systemd[206268]: Reached target Shutdown.
Jan 26 15:11:37 compute-1 systemd[206268]: Finished Exit the Session.
Jan 26 15:11:37 compute-1 systemd[206268]: Reached target Exit the Session.
Jan 26 15:11:37 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Jan 26 15:11:37 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Jan 26 15:11:37 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 26 15:11:37 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 26 15:11:37 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 26 15:11:37 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 26 15:11:37 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Jan 26 15:11:38 compute-1 nova_compute[183403]: 2026-01-26 15:11:38.060 183407 WARNING neutronclient.v2_0.client [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:38 compute-1 nova_compute[183403]: 2026-01-26 15:11:38.276 183407 DEBUG nova.network.neutron [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Updating instance_info_cache with network_info: [{"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:11:38 compute-1 nova_compute[183403]: 2026-01-26 15:11:38.666 183407 INFO nova.compute.manager [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Post operation of migration started
Jan 26 15:11:38 compute-1 nova_compute[183403]: 2026-01-26 15:11:38.667 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:38 compute-1 nova_compute[183403]: 2026-01-26 15:11:38.782 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-7484f2cd-93e8-4578-9c4a-5bc1e0b49d10" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:11:38 compute-1 nova_compute[183403]: 2026-01-26 15:11:38.786 183407 DEBUG oslo_concurrency.lockutils [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-7484f2cd-93e8-4578-9c4a-5bc1e0b49d10" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:11:38 compute-1 nova_compute[183403]: 2026-01-26 15:11:38.787 183407 DEBUG nova.network.neutron [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Refreshing network info cache for port 2282f681-f2e6-4601-a493-3244e752f7a4 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.090 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.091 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.175 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-6532f73e-4d35-42af-b257-7ddf3cd08929" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.176 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-6532f73e-4d35-42af-b257-7ddf3cd08929" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.176 183407 DEBUG nova.network.neutron [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.295 183407 WARNING neutronclient.v2_0.client [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.343 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.348 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.349 183407 INFO nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Creating image(s)
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.351 183407 DEBUG nova.objects.instance [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.453 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.656 183407 WARNING neutronclient.v2_0.client [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.683 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.859 183407 DEBUG oslo_concurrency.processutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.943 183407 DEBUG oslo_concurrency.processutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.944 183407 DEBUG nova.virt.disk.api [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:11:39 compute-1 nova_compute[183403]: 2026-01-26 15:11:39.944 183407 DEBUG oslo_concurrency.processutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.006 183407 DEBUG oslo_concurrency.processutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.007 183407 DEBUG nova.virt.disk.api [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.101 183407 DEBUG nova.network.neutron [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Updated VIF entry in instance network info cache for port 2282f681-f2e6-4601-a493-3244e752f7a4. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.102 183407 DEBUG nova.network.neutron [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Updating instance_info_cache with network_info: [{"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.110 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.436 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.593 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.593 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Ensure instance console log exists: /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.594 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.594 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.594 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.597 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Start _get_guest_xml network_info=[{"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:29:1b:91"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.603 183407 WARNING nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.605 183407 DEBUG nova.virt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1496374934', uuid='7484f2cd-93e8-4578-9c4a-5bc1e0b49d10'), owner=OwnerMeta(userid='afb4f4811cb043dca89a8413c390ba3d', username='tempest-TestExecuteActionsViaActuator-280856547-project-admin', projectid='6377892a338d4a7cbe63cf30bd2c63ea', projectname='tempest-TestExecuteActionsViaActuator-280856547'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:29:1b:91"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769440300.6058433) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.608 183407 DEBUG nova.network.neutron [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Updating instance_info_cache with network_info: [{"id": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "address": "fa:16:3e:a4:ff:b6", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe6de1e-43", "ovs_interfaceid": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.610 183407 DEBUG oslo_concurrency.lockutils [req-9ed7fd4d-babf-461e-a62d-a6c3f610b601 req-3cae8a7f-ce3d-4f30-ba0f-bbddb3f9efc2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-7484f2cd-93e8-4578-9c4a-5bc1e0b49d10" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.615 183407 DEBUG nova.virt.libvirt.host [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.615 183407 DEBUG nova.virt.libvirt.host [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.620 183407 DEBUG nova.virt.libvirt.host [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.621 183407 DEBUG nova.virt.libvirt.host [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.623 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.623 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.624 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.624 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.625 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.625 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.625 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.625 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.626 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.626 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.626 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.626 183407 DEBUG nova.virt.hardware [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:11:40 compute-1 nova_compute[183403]: 2026-01-26 15:11:40.627 183407 DEBUG nova.objects.instance [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.183 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-6532f73e-4d35-42af-b257-7ddf3cd08929" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.185 183407 DEBUG nova.objects.base [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<7484f2cd-93e8-4578-9c4a-5bc1e0b49d10> lazy-loaded attributes: trusted_certs,vcpu_model wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.189 183407 DEBUG oslo_concurrency.processutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.252 183407 DEBUG oslo_concurrency.processutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk.config --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.253 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "/var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.253 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "/var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.254 183407 DEBUG oslo_concurrency.lockutils [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "/var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.255 183407 DEBUG nova.virt.libvirt.vif [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:10:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1496374934',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1496374934',id=8,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:10:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-2b7ioems',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vide
o_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:11:28Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=7484f2cd-93e8-4578-9c4a-5bc1e0b49d10,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:29:1b:91"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.256 183407 DEBUG nova.network.os_vif_util [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:29:1b:91"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.256 183407 DEBUG nova.network.os_vif_util [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:1b:91,bridge_name='br-int',has_traffic_filtering=True,id=2282f681-f2e6-4601-a493-3244e752f7a4,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2282f681-f2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.259 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <uuid>7484f2cd-93e8-4578-9c4a-5bc1e0b49d10</uuid>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <name>instance-00000008</name>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1496374934</nova:name>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:11:40</nova:creationTime>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:property name="hw_input_bus">usb</nova:property>
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:property name="hw_machine_type">q35</nova:property>
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:property name="hw_video_model">virtio</nova:property>
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:property name="hw_vif_model">virtio</nova:property>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:user uuid="afb4f4811cb043dca89a8413c390ba3d">tempest-TestExecuteActionsViaActuator-280856547-project-admin</nova:user>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:project uuid="6377892a338d4a7cbe63cf30bd2c63ea">tempest-TestExecuteActionsViaActuator-280856547</nova:project>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         <nova:port uuid="2282f681-f2e6-4601-a493-3244e752f7a4">
Jan 26 15:11:41 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <system>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <entry name="serial">7484f2cd-93e8-4578-9c4a-5bc1e0b49d10</entry>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <entry name="uuid">7484f2cd-93e8-4578-9c4a-5bc1e0b49d10</entry>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     </system>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <os>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   </os>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <features>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   </features>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk.config"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:29:1b:91"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <target dev="tap2282f681-f2"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/console.log" append="off"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <video>
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     </video>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:11:41 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:11:41 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:11:41 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:11:41 compute-1 nova_compute[183403]: </domain>
Jan 26 15:11:41 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.260 183407 DEBUG nova.virt.libvirt.vif [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:10:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1496374934',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1496374934',id=8,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:10:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-2b7ioems',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vide
o_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:11:28Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=7484f2cd-93e8-4578-9c4a-5bc1e0b49d10,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:29:1b:91"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.260 183407 DEBUG nova.network.os_vif_util [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "vif_mac": "fa:16:3e:29:1b:91"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.261 183407 DEBUG nova.network.os_vif_util [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:1b:91,bridge_name='br-int',has_traffic_filtering=True,id=2282f681-f2e6-4601-a493-3244e752f7a4,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2282f681-f2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.261 183407 DEBUG os_vif [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:1b:91,bridge_name='br-int',has_traffic_filtering=True,id=2282f681-f2e6-4601-a493-3244e752f7a4,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2282f681-f2') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.262 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.262 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.262 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.263 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.264 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1a27d991-89fb-5673-8d4c-75cc26509d05', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.265 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.270 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.280 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.280 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2282f681-f2, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.281 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2282f681-f2, col_values=(('qos', UUID('8621c5dd-108a-426c-8407-84b7934469e1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.281 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2282f681-f2, col_values=(('external_ids', {'iface-id': '2282f681-f2e6-4601-a493-3244e752f7a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:29:1b:91', 'vm-uuid': '7484f2cd-93e8-4578-9c4a-5bc1e0b49d10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.283 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:41 compute-1 NetworkManager[55716]: <info>  [1769440301.2846] manager: (tap2282f681-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.285 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.292 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.294 183407 INFO os_vif [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:1b:91,bridge_name='br-int',has_traffic_filtering=True,id=2282f681-f2e6-4601-a493-3244e752f7a4,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2282f681-f2')
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.772 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.773 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.773 183407 DEBUG oslo_concurrency.lockutils [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:41 compute-1 nova_compute[183403]: 2026-01-26 15:11:41.778 183407 INFO nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:11:41 compute-1 virtqemud[183290]: Domain id=6 name='instance-00000006' uuid=6532f73e-4d35-42af-b257-7ddf3cd08929 is tainted: custom-monitor
Jan 26 15:11:42 compute-1 nova_compute[183403]: 2026-01-26 15:11:42.785 183407 INFO nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.034 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.034 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.035 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No VIF found with MAC fa:16:3e:29:1b:91, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.035 183407 INFO nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Using config drive
Jan 26 15:11:43 compute-1 kernel: tap2282f681-f2: entered promiscuous mode
Jan 26 15:11:43 compute-1 NetworkManager[55716]: <info>  [1769440303.1154] manager: (tap2282f681-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Jan 26 15:11:43 compute-1 ovn_controller[95641]: 2026-01-26T15:11:43Z|00072|binding|INFO|Claiming lport 2282f681-f2e6-4601-a493-3244e752f7a4 for this chassis.
Jan 26 15:11:43 compute-1 ovn_controller[95641]: 2026-01-26T15:11:43Z|00073|binding|INFO|2282f681-f2e6-4601-a493-3244e752f7a4: Claiming fa:16:3e:29:1b:91 10.100.0.3
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.119 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.127 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:1b:91 10.100.0.3'], port_security=['fa:16:3e:29:1b:91 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7484f2cd-93e8-4578-9c4a-5bc1e0b49d10', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '9', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=2282f681-f2e6-4601-a493-3244e752f7a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.129 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 2282f681-f2e6-4601-a493-3244e752f7a4 in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf bound to our chassis
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.130 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.130 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:43 compute-1 ovn_controller[95641]: 2026-01-26T15:11:43Z|00074|binding|INFO|Setting lport 2282f681-f2e6-4601-a493-3244e752f7a4 ovn-installed in OVS
Jan 26 15:11:43 compute-1 ovn_controller[95641]: 2026-01-26T15:11:43Z|00075|binding|INFO|Setting lport 2282f681-f2e6-4601-a493-3244e752f7a4 up in Southbound
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.133 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:43 compute-1 systemd-udevd[206487]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.157 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bc40b373-3266-48de-9666-9d5c44972117]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:43 compute-1 systemd-machined[154697]: New machine qemu-7-instance-00000008.
Jan 26 15:11:43 compute-1 NetworkManager[55716]: <info>  [1769440303.1746] device (tap2282f681-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:11:43 compute-1 NetworkManager[55716]: <info>  [1769440303.1751] device (tap2282f681-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:11:43 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-00000008.
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.193 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[5973f4bf-36d0-4e4a-8175-2e65d8abe9dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.197 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[f25ca12b-d2ab-4bfc-b2fe-bae93aae7930]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.226 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[45f77a1c-af1c-48b2-b918-a26ad66a6d7d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.248 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a7bb6740-c870-4655-a8ae-4702c81a6b3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 27, 'tx_packets': 13, 'rx_bytes': 1414, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 27, 'tx_packets': 13, 'rx_bytes': 1414, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206500, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.266 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[442d24b5-b461-48cb-9dc7-db3a36383a1f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206502, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206502, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.268 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.269 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.270 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.271 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.272 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.272 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.272 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:11:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:11:43.274 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[df9c3787-c636-42f4-b5aa-dc76504d760b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.524 183407 DEBUG nova.compute.manager [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.528 183407 INFO nova.virt.libvirt.driver [-] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Instance running successfully.
Jan 26 15:11:43 compute-1 virtqemud[183290]: argument unsupported: QEMU guest agent is not configured
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.529 183407 DEBUG nova.virt.libvirt.guest [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.530 183407 DEBUG nova.virt.libvirt.driver [None req-c6edb81b-d82c-48a2-b711-362638447ebe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.792 183407 INFO nova.virt.libvirt.driver [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:11:43 compute-1 nova_compute[183403]: 2026-01-26 15:11:43.797 183407 DEBUG nova.compute.manager [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:11:44 compute-1 nova_compute[183403]: 2026-01-26 15:11:44.372 183407 DEBUG nova.objects.instance [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:11:44 compute-1 nova_compute[183403]: 2026-01-26 15:11:44.455 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:45 compute-1 nova_compute[183403]: 2026-01-26 15:11:45.264 183407 DEBUG nova.compute.manager [req-a59d0922-5d19-4fd1-915c-e08111b45511 req-77b4e6d6-1ba0-41e3-aa0d-b18e609106c2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-vif-plugged-2282f681-f2e6-4601-a493-3244e752f7a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:11:45 compute-1 nova_compute[183403]: 2026-01-26 15:11:45.265 183407 DEBUG oslo_concurrency.lockutils [req-a59d0922-5d19-4fd1-915c-e08111b45511 req-77b4e6d6-1ba0-41e3-aa0d-b18e609106c2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:45 compute-1 nova_compute[183403]: 2026-01-26 15:11:45.265 183407 DEBUG oslo_concurrency.lockutils [req-a59d0922-5d19-4fd1-915c-e08111b45511 req-77b4e6d6-1ba0-41e3-aa0d-b18e609106c2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:45 compute-1 nova_compute[183403]: 2026-01-26 15:11:45.265 183407 DEBUG oslo_concurrency.lockutils [req-a59d0922-5d19-4fd1-915c-e08111b45511 req-77b4e6d6-1ba0-41e3-aa0d-b18e609106c2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:45 compute-1 nova_compute[183403]: 2026-01-26 15:11:45.266 183407 DEBUG nova.compute.manager [req-a59d0922-5d19-4fd1-915c-e08111b45511 req-77b4e6d6-1ba0-41e3-aa0d-b18e609106c2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] No waiting events found dispatching network-vif-plugged-2282f681-f2e6-4601-a493-3244e752f7a4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:11:45 compute-1 nova_compute[183403]: 2026-01-26 15:11:45.266 183407 WARNING nova.compute.manager [req-a59d0922-5d19-4fd1-915c-e08111b45511 req-77b4e6d6-1ba0-41e3-aa0d-b18e609106c2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received unexpected event network-vif-plugged-2282f681-f2e6-4601-a493-3244e752f7a4 for instance with vm_state resized and task_state None.
Jan 26 15:11:45 compute-1 nova_compute[183403]: 2026-01-26 15:11:45.391 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:45 compute-1 nova_compute[183403]: 2026-01-26 15:11:45.505 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:45 compute-1 nova_compute[183403]: 2026-01-26 15:11:45.505 183407 WARNING neutronclient.v2_0.client [None req-d9f480cb-8966-4c0d-a2e3-0c2dc339ec1b a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:11:46 compute-1 nova_compute[183403]: 2026-01-26 15:11:46.285 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:47 compute-1 nova_compute[183403]: 2026-01-26 15:11:47.585 183407 DEBUG nova.compute.manager [req-102d422c-18f0-46fb-9696-9c162a130282 req-85d4851a-cdbf-4791-aad6-d22b7b405f2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-vif-plugged-2282f681-f2e6-4601-a493-3244e752f7a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:11:47 compute-1 nova_compute[183403]: 2026-01-26 15:11:47.585 183407 DEBUG oslo_concurrency.lockutils [req-102d422c-18f0-46fb-9696-9c162a130282 req-85d4851a-cdbf-4791-aad6-d22b7b405f2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:11:47 compute-1 nova_compute[183403]: 2026-01-26 15:11:47.585 183407 DEBUG oslo_concurrency.lockutils [req-102d422c-18f0-46fb-9696-9c162a130282 req-85d4851a-cdbf-4791-aad6-d22b7b405f2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:11:47 compute-1 nova_compute[183403]: 2026-01-26 15:11:47.586 183407 DEBUG oslo_concurrency.lockutils [req-102d422c-18f0-46fb-9696-9c162a130282 req-85d4851a-cdbf-4791-aad6-d22b7b405f2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:11:47 compute-1 nova_compute[183403]: 2026-01-26 15:11:47.586 183407 DEBUG nova.compute.manager [req-102d422c-18f0-46fb-9696-9c162a130282 req-85d4851a-cdbf-4791-aad6-d22b7b405f2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] No waiting events found dispatching network-vif-plugged-2282f681-f2e6-4601-a493-3244e752f7a4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:11:47 compute-1 nova_compute[183403]: 2026-01-26 15:11:47.586 183407 WARNING nova.compute.manager [req-102d422c-18f0-46fb-9696-9c162a130282 req-85d4851a-cdbf-4791-aad6-d22b7b405f2a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received unexpected event network-vif-plugged-2282f681-f2e6-4601-a493-3244e752f7a4 for instance with vm_state resized and task_state None.
Jan 26 15:11:47 compute-1 sshd-session[206458]: Invalid user github from 185.246.128.170 port 40860
Jan 26 15:11:48 compute-1 sshd-session[206458]: Disconnecting invalid user github 185.246.128.170 port 40860: Change of username or service not allowed: (github,ssh-connection) -> (elasticsearch,ssh-connection) [preauth]
Jan 26 15:11:49 compute-1 openstack_network_exporter[195610]: ERROR   15:11:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:11:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:11:49 compute-1 openstack_network_exporter[195610]: ERROR   15:11:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:11:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:11:49 compute-1 nova_compute[183403]: 2026-01-26 15:11:49.457 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:50 compute-1 podman[206511]: 2026-01-26 15:11:50.897443637 +0000 UTC m=+0.067611270 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:11:50 compute-1 podman[206512]: 2026-01-26 15:11:50.909514667 +0000 UTC m=+0.078021539 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Jan 26 15:11:51 compute-1 nova_compute[183403]: 2026-01-26 15:11:51.289 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:54 compute-1 nova_compute[183403]: 2026-01-26 15:11:54.460 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:54 compute-1 sshd-session[206554]: Invalid user elasticsearch from 185.246.128.170 port 38496
Jan 26 15:11:56 compute-1 ovn_controller[95641]: 2026-01-26T15:11:56Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:29:1b:91 10.100.0.3
Jan 26 15:11:56 compute-1 nova_compute[183403]: 2026-01-26 15:11:56.294 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:11:59 compute-1 sshd-session[206554]: Disconnecting invalid user elasticsearch 185.246.128.170 port 38496: Change of username or service not allowed: (elasticsearch,ssh-connection) -> (super,ssh-connection) [preauth]
Jan 26 15:11:59 compute-1 nova_compute[183403]: 2026-01-26 15:11:59.461 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:00 compute-1 nova_compute[183403]: 2026-01-26 15:12:00.083 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:12:00 compute-1 nova_compute[183403]: 2026-01-26 15:12:00.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:12:01 compute-1 nova_compute[183403]: 2026-01-26 15:12:01.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:01 compute-1 nova_compute[183403]: 2026-01-26 15:12:01.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:01 compute-1 nova_compute[183403]: 2026-01-26 15:12:01.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:01 compute-1 nova_compute[183403]: 2026-01-26 15:12:01.095 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:12:01 compute-1 podman[206569]: 2026-01-26 15:12:01.219842117 +0000 UTC m=+0.066617796 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 15:12:01 compute-1 podman[206568]: 2026-01-26 15:12:01.239502114 +0000 UTC m=+0.094493079 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, 
config_id=ovn_controller, org.label-schema.build-date=20260120)
Jan 26 15:12:01 compute-1 nova_compute[183403]: 2026-01-26 15:12:01.296 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.158 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.234 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.237 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.305 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.311 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.383 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.384 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.440 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.446 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.512 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.513 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.576 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.582 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.640 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.641 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.695 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.701 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.757 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.758 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.807 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.812 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.867 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.868 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:02 compute-1 nova_compute[183403]: 2026-01-26 15:12:02.953 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:03 compute-1 nova_compute[183403]: 2026-01-26 15:12:03.181 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:12:03 compute-1 nova_compute[183403]: 2026-01-26 15:12:03.182 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:12:03 compute-1 nova_compute[183403]: 2026-01-26 15:12:03.203 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:12:03 compute-1 nova_compute[183403]: 2026-01-26 15:12:03.204 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4937MB free_disk=72.97610473632812GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:12:03 compute-1 nova_compute[183403]: 2026-01-26 15:12:03.204 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:03 compute-1 nova_compute[183403]: 2026-01-26 15:12:03.205 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:03 compute-1 sshd-session[206564]: Invalid user super from 185.246.128.170 port 47376
Jan 26 15:12:04 compute-1 nova_compute[183403]: 2026-01-26 15:12:04.464 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:04 compute-1 nova_compute[183403]: 2026-01-26 15:12:04.972 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 66a7af21-1abe-467f-b739-441e05a4b09a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:12:04 compute-1 nova_compute[183403]: 2026-01-26 15:12:04.973 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 8c64a2e0-f723-4adb-84fc-867073a92349 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:12:04 compute-1 nova_compute[183403]: 2026-01-26 15:12:04.973 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance aab8c28e-0489-40bd-88cf-5eb7c419933a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:12:04 compute-1 nova_compute[183403]: 2026-01-26 15:12:04.973 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 981e6db3-c4e9-422b-91bb-a2c1c5869fc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:12:04 compute-1 nova_compute[183403]: 2026-01-26 15:12:04.973 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 6532f73e-4d35-42af-b257-7ddf3cd08929 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:12:04 compute-1 nova_compute[183403]: 2026-01-26 15:12:04.973 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:12:04 compute-1 nova_compute[183403]: 2026-01-26 15:12:04.973 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 6 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:12:04 compute-1 nova_compute[183403]: 2026-01-26 15:12:04.973 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1344MB phys_disk=79GB used_disk=6GB total_vcpus=8 used_vcpus=6 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:12:03 up  1:07,  0 user,  load average: 0.48, 0.29, 0.38\n', 'num_instances': '6', 'num_vm_active': '6', 'num_task_None': '6', 'num_os_type_None': '6', 'num_proj_6377892a338d4a7cbe63cf30bd2c63ea': '6', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.017 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.060 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.060 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.073 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.096 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.202 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:12:05 compute-1 podman[192725]: time="2026-01-26T15:12:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:12:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:12:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:12:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:12:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2650 "" "Go-http-client/1.1"
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.709 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.797 183407 DEBUG oslo_concurrency.lockutils [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.798 183407 DEBUG oslo_concurrency.lockutils [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.798 183407 DEBUG oslo_concurrency.lockutils [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.799 183407 DEBUG oslo_concurrency.lockutils [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.799 183407 DEBUG oslo_concurrency.lockutils [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:05 compute-1 nova_compute[183403]: 2026-01-26 15:12:05.856 183407 INFO nova.compute.manager [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Terminating instance
Jan 26 15:12:06 compute-1 sshd-session[206564]: Disconnecting invalid user super 185.246.128.170 port 47376: Change of username or service not allowed: (super,ssh-connection) -> (victor,ssh-connection) [preauth]
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.220 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.221 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.016s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.299 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.389 183407 DEBUG nova.compute.manager [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:12:06 compute-1 kernel: tapb50fc69b-cf (unregistering): left promiscuous mode
Jan 26 15:12:06 compute-1 NetworkManager[55716]: <info>  [1769440326.5360] device (tapb50fc69b-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.544 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:06 compute-1 ovn_controller[95641]: 2026-01-26T15:12:06Z|00076|binding|INFO|Releasing lport b50fc69b-cfde-429d-908f-cde6f56037bf from this chassis (sb_readonly=0)
Jan 26 15:12:06 compute-1 ovn_controller[95641]: 2026-01-26T15:12:06Z|00077|binding|INFO|Setting lport b50fc69b-cfde-429d-908f-cde6f56037bf down in Southbound
Jan 26 15:12:06 compute-1 ovn_controller[95641]: 2026-01-26T15:12:06Z|00078|binding|INFO|Removing iface tapb50fc69b-cf ovn-installed in OVS
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.547 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.556 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:ac:22 10.100.0.10'], port_security=['fa:16:3e:7f:ac:22 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '981e6db3-c4e9-422b-91bb-a2c1c5869fc4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=b50fc69b-cfde-429d-908f-cde6f56037bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.557 104930 INFO neutron.agent.ovn.metadata.agent [-] Port b50fc69b-cfde-429d-908f-cde6f56037bf in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf unbound from our chassis
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.559 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.559 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.585 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1ddf4a-6ae9-4c82-b6cf-d503255d5b75]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:06 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 26 15:12:06 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 16.668s CPU time.
Jan 26 15:12:06 compute-1 systemd-machined[154697]: Machine qemu-5-instance-00000009 terminated.
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.633 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6a22c8-03fe-4e77-b4e8-1a92cdc69d5a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.636 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[c15ec22a-9b17-4a79-9a4e-57834991a5a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.681 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[7cf8908d-a362-4b60-8f53-6763d3b278d5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.697 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[b127bed1-639a-41be-8697-0d116e54050e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 15, 'rx_bytes': 1792, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 15, 'rx_bytes': 1792, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206673, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.712 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c5804edb-83e7-4706-a0df-76e97f22b1a5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206674, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206674, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.714 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.729 183407 DEBUG nova.compute.manager [req-74da1b2f-5c9e-48b5-a0fe-8be7d0025a47 req-292cf748-0e5a-4660-a863-f504c5eda4d9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Received event network-vif-unplugged-b50fc69b-cfde-429d-908f-cde6f56037bf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.729 183407 DEBUG oslo_concurrency.lockutils [req-74da1b2f-5c9e-48b5-a0fe-8be7d0025a47 req-292cf748-0e5a-4660-a863-f504c5eda4d9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.729 183407 DEBUG oslo_concurrency.lockutils [req-74da1b2f-5c9e-48b5-a0fe-8be7d0025a47 req-292cf748-0e5a-4660-a863-f504c5eda4d9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.729 183407 DEBUG oslo_concurrency.lockutils [req-74da1b2f-5c9e-48b5-a0fe-8be7d0025a47 req-292cf748-0e5a-4660-a863-f504c5eda4d9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.730 183407 DEBUG nova.compute.manager [req-74da1b2f-5c9e-48b5-a0fe-8be7d0025a47 req-292cf748-0e5a-4660-a863-f504c5eda4d9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] No waiting events found dispatching network-vif-unplugged-b50fc69b-cfde-429d-908f-cde6f56037bf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.730 183407 DEBUG nova.compute.manager [req-74da1b2f-5c9e-48b5-a0fe-8be7d0025a47 req-292cf748-0e5a-4660-a863-f504c5eda4d9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Received event network-vif-unplugged-b50fc69b-cfde-429d-908f-cde6f56037bf for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.751 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.756 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.756 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.757 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.757 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.758 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:06.759 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2621c347-e297-4a9d-9106-1a5e99b1f8ad]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.809 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.817 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.868 183407 INFO nova.virt.libvirt.driver [-] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Instance destroyed successfully.
Jan 26 15:12:06 compute-1 nova_compute[183403]: 2026-01-26 15:12:06.868 183407 DEBUG nova.objects.instance [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lazy-loading 'resources' on Instance uuid 981e6db3-c4e9-422b-91bb-a2c1c5869fc4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.595 183407 DEBUG nova.virt.libvirt.vif [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:10:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-855077368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-855077368',id=9,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:11:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-mka0m05y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:11:02Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=981e6db3-c4e9-422b-91bb-a2c1c5869fc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.595 183407 DEBUG nova.network.os_vif_util [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "b50fc69b-cfde-429d-908f-cde6f56037bf", "address": "fa:16:3e:7f:ac:22", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb50fc69b-cf", "ovs_interfaceid": "b50fc69b-cfde-429d-908f-cde6f56037bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.596 183407 DEBUG nova.network.os_vif_util [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:ac:22,bridge_name='br-int',has_traffic_filtering=True,id=b50fc69b-cfde-429d-908f-cde6f56037bf,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb50fc69b-cf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.596 183407 DEBUG os_vif [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:ac:22,bridge_name='br-int',has_traffic_filtering=True,id=b50fc69b-cfde-429d-908f-cde6f56037bf,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb50fc69b-cf') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.598 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.598 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb50fc69b-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.600 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.601 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.602 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.602 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=fb96a913-e619-49d9-8787-9a482bf4ae58) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.603 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.603 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.606 183407 INFO os_vif [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:ac:22,bridge_name='br-int',has_traffic_filtering=True,id=b50fc69b-cfde-429d-908f-cde6f56037bf,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb50fc69b-cf')
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.607 183407 INFO nova.virt.libvirt.driver [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Deleting instance files /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4_del
Jan 26 15:12:07 compute-1 nova_compute[183403]: 2026-01-26 15:12:07.607 183407 INFO nova.virt.libvirt.driver [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Deletion of /var/lib/nova/instances/981e6db3-c4e9-422b-91bb-a2c1c5869fc4_del complete
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.120 183407 INFO nova.compute.manager [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Took 1.73 seconds to destroy the instance on the hypervisor.
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.120 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.120 183407 DEBUG nova.compute.manager [-] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.120 183407 DEBUG nova.network.neutron [-] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.121 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.237 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.540 183407 DEBUG nova.compute.manager [req-d45ca1f6-3d99-4be9-9559-ada33892e332 req-d343c170-c47e-49e4-94b0-382d443d1277 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Received event network-vif-deleted-b50fc69b-cfde-429d-908f-cde6f56037bf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.540 183407 INFO nova.compute.manager [req-d45ca1f6-3d99-4be9-9559-ada33892e332 req-d343c170-c47e-49e4-94b0-382d443d1277 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Neutron deleted interface b50fc69b-cfde-429d-908f-cde6f56037bf; detaching it from the instance and deleting it from the info cache
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.541 183407 DEBUG nova.network.neutron [req-d45ca1f6-3d99-4be9-9559-ada33892e332 req-d343c170-c47e-49e4-94b0-382d443d1277 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.793 183407 DEBUG nova.compute.manager [req-a8743d18-ecc9-4c1c-939d-3110116d57a5 req-ab0dda59-4a96-4350-92c4-ef243176f593 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Received event network-vif-unplugged-b50fc69b-cfde-429d-908f-cde6f56037bf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.794 183407 DEBUG oslo_concurrency.lockutils [req-a8743d18-ecc9-4c1c-939d-3110116d57a5 req-ab0dda59-4a96-4350-92c4-ef243176f593 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.794 183407 DEBUG oslo_concurrency.lockutils [req-a8743d18-ecc9-4c1c-939d-3110116d57a5 req-ab0dda59-4a96-4350-92c4-ef243176f593 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.795 183407 DEBUG oslo_concurrency.lockutils [req-a8743d18-ecc9-4c1c-939d-3110116d57a5 req-ab0dda59-4a96-4350-92c4-ef243176f593 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.795 183407 DEBUG nova.compute.manager [req-a8743d18-ecc9-4c1c-939d-3110116d57a5 req-ab0dda59-4a96-4350-92c4-ef243176f593 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] No waiting events found dispatching network-vif-unplugged-b50fc69b-cfde-429d-908f-cde6f56037bf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.795 183407 DEBUG nova.compute.manager [req-a8743d18-ecc9-4c1c-939d-3110116d57a5 req-ab0dda59-4a96-4350-92c4-ef243176f593 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Received event network-vif-unplugged-b50fc69b-cfde-429d-908f-cde6f56037bf for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:08 compute-1 nova_compute[183403]: 2026-01-26 15:12:08.986 183407 DEBUG nova.network.neutron [-] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.050 183407 DEBUG nova.compute.manager [req-d45ca1f6-3d99-4be9-9559-ada33892e332 req-d343c170-c47e-49e4-94b0-382d443d1277 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Detach interface failed, port_id=b50fc69b-cfde-429d-908f-cde6f56037bf, reason: Instance 981e6db3-c4e9-422b-91bb-a2c1c5869fc4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.221 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.467 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.634 183407 INFO nova.compute.manager [-] [instance: 981e6db3-c4e9-422b-91bb-a2c1c5869fc4] Took 1.51 seconds to deallocate network for instance.
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.736 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.736 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.737 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.737 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.737 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:12:09 compute-1 nova_compute[183403]: 2026-01-26 15:12:09.737 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:12:11 compute-1 nova_compute[183403]: 2026-01-26 15:12:11.088 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:12:12 compute-1 nova_compute[183403]: 2026-01-26 15:12:12.604 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:13 compute-1 sshd-session[206692]: Invalid user victor from 185.246.128.170 port 11481
Jan 26 15:12:13 compute-1 sshd-session[206692]: Disconnecting invalid user victor 185.246.128.170 port 11481: Change of username or service not allowed: (victor,ssh-connection) -> (sshadmin,ssh-connection) [preauth]
Jan 26 15:12:14 compute-1 nova_compute[183403]: 2026-01-26 15:12:14.120 183407 DEBUG oslo_concurrency.lockutils [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:14 compute-1 nova_compute[183403]: 2026-01-26 15:12:14.121 183407 DEBUG oslo_concurrency.lockutils [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:14 compute-1 nova_compute[183403]: 2026-01-26 15:12:14.289 183407 DEBUG nova.compute.provider_tree [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:12:14 compute-1 nova_compute[183403]: 2026-01-26 15:12:14.470 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:14 compute-1 nova_compute[183403]: 2026-01-26 15:12:14.867 183407 DEBUG nova.scheduler.client.report [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:12:15 compute-1 nova_compute[183403]: 2026-01-26 15:12:15.497 183407 DEBUG oslo_concurrency.lockutils [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.376s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:16 compute-1 nova_compute[183403]: 2026-01-26 15:12:16.464 183407 INFO nova.scheduler.client.report [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Deleted allocations for instance 981e6db3-c4e9-422b-91bb-a2c1c5869fc4
Jan 26 15:12:17 compute-1 nova_compute[183403]: 2026-01-26 15:12:17.607 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:18 compute-1 sshd-session[206694]: Invalid user sshadmin from 185.246.128.170 port 34987
Jan 26 15:12:18 compute-1 nova_compute[183403]: 2026-01-26 15:12:18.815 183407 DEBUG oslo_concurrency.lockutils [None req-07238e0c-fe0b-4b8b-a8f7-b8cd795e128d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "981e6db3-c4e9-422b-91bb-a2c1c5869fc4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 13.017s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:19 compute-1 sshd-session[206694]: Disconnecting invalid user sshadmin 185.246.128.170 port 34987: Change of username or service not allowed: (sshadmin,ssh-connection) -> (debian,ssh-connection) [preauth]
Jan 26 15:12:19 compute-1 openstack_network_exporter[195610]: ERROR   15:12:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:12:19 compute-1 openstack_network_exporter[195610]: ERROR   15:12:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:12:19 compute-1 nova_compute[183403]: 2026-01-26 15:12:19.472 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:20 compute-1 nova_compute[183403]: 2026-01-26 15:12:20.720 183407 DEBUG oslo_concurrency.lockutils [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:20 compute-1 nova_compute[183403]: 2026-01-26 15:12:20.721 183407 DEBUG oslo_concurrency.lockutils [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:20 compute-1 nova_compute[183403]: 2026-01-26 15:12:20.721 183407 DEBUG oslo_concurrency.lockutils [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:20 compute-1 nova_compute[183403]: 2026-01-26 15:12:20.721 183407 DEBUG oslo_concurrency.lockutils [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:20 compute-1 nova_compute[183403]: 2026-01-26 15:12:20.721 183407 DEBUG oslo_concurrency.lockutils [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:21 compute-1 nova_compute[183403]: 2026-01-26 15:12:21.245 183407 INFO nova.compute.manager [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Terminating instance
Jan 26 15:12:21 compute-1 podman[206696]: 2026-01-26 15:12:21.901376132 +0000 UTC m=+0.069602714 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:12:21 compute-1 podman[206697]: 2026-01-26 15:12:21.904569016 +0000 UTC m=+0.070158009 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.014 183407 DEBUG nova.compute.manager [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:12:22 compute-1 kernel: tap2282f681-f2 (unregistering): left promiscuous mode
Jan 26 15:12:22 compute-1 NetworkManager[55716]: <info>  [1769440342.0512] device (tap2282f681-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:12:22 compute-1 ovn_controller[95641]: 2026-01-26T15:12:22Z|00079|binding|INFO|Releasing lport 2282f681-f2e6-4601-a493-3244e752f7a4 from this chassis (sb_readonly=0)
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.055 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 ovn_controller[95641]: 2026-01-26T15:12:22Z|00080|binding|INFO|Setting lport 2282f681-f2e6-4601-a493-3244e752f7a4 down in Southbound
Jan 26 15:12:22 compute-1 ovn_controller[95641]: 2026-01-26T15:12:22Z|00081|binding|INFO|Removing iface tap2282f681-f2 ovn-installed in OVS
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.060 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.081 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 26 15:12:22 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Consumed 14.052s CPU time.
Jan 26 15:12:22 compute-1 systemd-machined[154697]: Machine qemu-7-instance-00000008 terminated.
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.240 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.251 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.267 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:1b:91 10.100.0.3'], port_security=['fa:16:3e:29:1b:91 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7484f2cd-93e8-4578-9c4a-5bc1e0b49d10', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '10', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=2282f681-f2e6-4601-a493-3244e752f7a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.268 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 2282f681-f2e6-4601-a493-3244e752f7a4 in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf unbound from our chassis
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.270 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.287 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3d3cb4-2d22-4eaa-a05e-c0377d95db31]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.296 183407 INFO nova.virt.libvirt.driver [-] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Instance destroyed successfully.
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.296 183407 DEBUG nova.objects.instance [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lazy-loading 'resources' on Instance uuid 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.317 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[80968157-9aa9-48aa-a951-df326a893af9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.320 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[95774ca5-4b8d-4fc9-9103-179fdeafbc20]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.354 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[d39692ee-fbfc-45f2-816b-0bfe830fc115]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.368 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[485bd757-a2d6-4a93-a6d1-8a7e00f005fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 1792, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 1792, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206773, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.383 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ad026759-53c7-4d30-bf92-13dd820a1190]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206774, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206774, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.384 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.408 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.412 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.412 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.412 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.413 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.413 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:22 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:22.414 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff5a27d-29ec-4c3f-abd5-597424451a38]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.609 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.833 183407 DEBUG nova.virt.libvirt.vif [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:10:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1496374934',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1496374934',id=8,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:11:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-2b7ioems',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:12:01Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=7484f2cd-93e8-4578-9c4a-5bc1e0b49d10,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.834 183407 DEBUG nova.network.os_vif_util [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "2282f681-f2e6-4601-a493-3244e752f7a4", "address": "fa:16:3e:29:1b:91", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2282f681-f2", "ovs_interfaceid": "2282f681-f2e6-4601-a493-3244e752f7a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.835 183407 DEBUG nova.network.os_vif_util [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:29:1b:91,bridge_name='br-int',has_traffic_filtering=True,id=2282f681-f2e6-4601-a493-3244e752f7a4,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2282f681-f2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.835 183407 DEBUG os_vif [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:1b:91,bridge_name='br-int',has_traffic_filtering=True,id=2282f681-f2e6-4601-a493-3244e752f7a4,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2282f681-f2') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.837 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.837 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2282f681-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.839 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.841 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.843 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.844 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=8621c5dd-108a-426c-8407-84b7934469e1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.844 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.845 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.850 183407 INFO os_vif [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:1b:91,bridge_name='br-int',has_traffic_filtering=True,id=2282f681-f2e6-4601-a493-3244e752f7a4,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2282f681-f2')
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.851 183407 INFO nova.virt.libvirt.driver [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Deleting instance files /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10_del
Jan 26 15:12:22 compute-1 nova_compute[183403]: 2026-01-26 15:12:22.854 183407 INFO nova.virt.libvirt.driver [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Deletion of /var/lib/nova/instances/7484f2cd-93e8-4578-9c4a-5bc1e0b49d10_del complete
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.520 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.525 183407 INFO nova.compute.manager [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Took 2.51 seconds to destroy the instance on the hypervisor.
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.525 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.526 183407 DEBUG nova.compute.manager [-] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.526 183407 DEBUG nova.network.neutron [-] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.526 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.616 183407 DEBUG nova.compute.manager [req-6938de48-0cdd-4617-a0b0-f1e35348a545 req-d6c63cbd-bd38-4539-96dc-717fb99b0d54 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.616 183407 DEBUG oslo_concurrency.lockutils [req-6938de48-0cdd-4617-a0b0-f1e35348a545 req-d6c63cbd-bd38-4539-96dc-717fb99b0d54 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.616 183407 DEBUG oslo_concurrency.lockutils [req-6938de48-0cdd-4617-a0b0-f1e35348a545 req-d6c63cbd-bd38-4539-96dc-717fb99b0d54 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.616 183407 DEBUG oslo_concurrency.lockutils [req-6938de48-0cdd-4617-a0b0-f1e35348a545 req-d6c63cbd-bd38-4539-96dc-717fb99b0d54 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.617 183407 DEBUG nova.compute.manager [req-6938de48-0cdd-4617-a0b0-f1e35348a545 req-d6c63cbd-bd38-4539-96dc-717fb99b0d54 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] No waiting events found dispatching network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:24 compute-1 nova_compute[183403]: 2026-01-26 15:12:24.617 183407 DEBUG nova.compute.manager [req-6938de48-0cdd-4617-a0b0-f1e35348a545 req-d6c63cbd-bd38-4539-96dc-717fb99b0d54 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:25 compute-1 sshd-session[206751]: Invalid user debian from 185.246.128.170 port 57093
Jan 26 15:12:25 compute-1 nova_compute[183403]: 2026-01-26 15:12:25.124 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:25 compute-1 sshd-session[206751]: Disconnecting invalid user debian 185.246.128.170 port 57093: Change of username or service not allowed: (debian,ssh-connection) -> (hugo,ssh-connection) [preauth]
Jan 26 15:12:26 compute-1 nova_compute[183403]: 2026-01-26 15:12:26.796 183407 DEBUG nova.compute.manager [req-a9b86cea-a196-4c77-86b9-830829d7a9f4 req-33dae3ef-dc73-4ffa-aa3f-98c174873a2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:26 compute-1 nova_compute[183403]: 2026-01-26 15:12:26.796 183407 DEBUG oslo_concurrency.lockutils [req-a9b86cea-a196-4c77-86b9-830829d7a9f4 req-33dae3ef-dc73-4ffa-aa3f-98c174873a2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:26 compute-1 nova_compute[183403]: 2026-01-26 15:12:26.796 183407 DEBUG oslo_concurrency.lockutils [req-a9b86cea-a196-4c77-86b9-830829d7a9f4 req-33dae3ef-dc73-4ffa-aa3f-98c174873a2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:26 compute-1 nova_compute[183403]: 2026-01-26 15:12:26.797 183407 DEBUG oslo_concurrency.lockutils [req-a9b86cea-a196-4c77-86b9-830829d7a9f4 req-33dae3ef-dc73-4ffa-aa3f-98c174873a2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:26 compute-1 nova_compute[183403]: 2026-01-26 15:12:26.797 183407 DEBUG nova.compute.manager [req-a9b86cea-a196-4c77-86b9-830829d7a9f4 req-33dae3ef-dc73-4ffa-aa3f-98c174873a2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] No waiting events found dispatching network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:26 compute-1 nova_compute[183403]: 2026-01-26 15:12:26.797 183407 DEBUG nova.compute.manager [req-a9b86cea-a196-4c77-86b9-830829d7a9f4 req-33dae3ef-dc73-4ffa-aa3f-98c174873a2d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-vif-unplugged-2282f681-f2e6-4601-a493-3244e752f7a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:27 compute-1 nova_compute[183403]: 2026-01-26 15:12:27.804 183407 DEBUG nova.compute.manager [req-863dbd25-c22a-49f6-abf4-34c4081b177d req-dd8ea0eb-c0ac-4f59-a657-2ca72217e5db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Received event network-vif-deleted-2282f681-f2e6-4601-a493-3244e752f7a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:27 compute-1 nova_compute[183403]: 2026-01-26 15:12:27.804 183407 INFO nova.compute.manager [req-863dbd25-c22a-49f6-abf4-34c4081b177d req-dd8ea0eb-c0ac-4f59-a657-2ca72217e5db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Neutron deleted interface 2282f681-f2e6-4601-a493-3244e752f7a4; detaching it from the instance and deleting it from the info cache
Jan 26 15:12:27 compute-1 nova_compute[183403]: 2026-01-26 15:12:27.805 183407 DEBUG nova.network.neutron [req-863dbd25-c22a-49f6-abf4-34c4081b177d req-dd8ea0eb-c0ac-4f59-a657-2ca72217e5db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:12:27 compute-1 nova_compute[183403]: 2026-01-26 15:12:27.845 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:28 compute-1 nova_compute[183403]: 2026-01-26 15:12:28.339 183407 DEBUG nova.network.neutron [-] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:12:28 compute-1 nova_compute[183403]: 2026-01-26 15:12:28.353 183407 DEBUG nova.compute.manager [req-863dbd25-c22a-49f6-abf4-34c4081b177d req-dd8ea0eb-c0ac-4f59-a657-2ca72217e5db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Detach interface failed, port_id=2282f681-f2e6-4601-a493-3244e752f7a4, reason: Instance 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:12:28 compute-1 nova_compute[183403]: 2026-01-26 15:12:28.858 183407 INFO nova.compute.manager [-] [instance: 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10] Took 4.33 seconds to deallocate network for instance.
Jan 26 15:12:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:29.036 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:29.037 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:29.038 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:29 compute-1 nova_compute[183403]: 2026-01-26 15:12:29.523 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:29 compute-1 nova_compute[183403]: 2026-01-26 15:12:29.655 183407 DEBUG oslo_concurrency.lockutils [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:29 compute-1 nova_compute[183403]: 2026-01-26 15:12:29.655 183407 DEBUG oslo_concurrency.lockutils [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:29 compute-1 nova_compute[183403]: 2026-01-26 15:12:29.768 183407 DEBUG nova.compute.provider_tree [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:12:30 compute-1 nova_compute[183403]: 2026-01-26 15:12:30.452 183407 DEBUG nova.scheduler.client.report [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:12:30 compute-1 nova_compute[183403]: 2026-01-26 15:12:30.968 183407 DEBUG oslo_concurrency.lockutils [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.313s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:31 compute-1 nova_compute[183403]: 2026-01-26 15:12:31.001 183407 INFO nova.scheduler.client.report [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Deleted allocations for instance 7484f2cd-93e8-4578-9c4a-5bc1e0b49d10
Jan 26 15:12:31 compute-1 podman[206779]: 2026-01-26 15:12:31.895239529 +0000 UTC m=+0.064694325 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:12:31 compute-1 podman[206778]: 2026-01-26 15:12:31.930091597 +0000 UTC m=+0.102956933 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:12:32 compute-1 nova_compute[183403]: 2026-01-26 15:12:32.186 183407 DEBUG oslo_concurrency.lockutils [None req-6fe1f3f2-9495-4179-9661-e3b2f22b5f2f afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "7484f2cd-93e8-4578-9c4a-5bc1e0b49d10" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.465s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:32 compute-1 nova_compute[183403]: 2026-01-26 15:12:32.847 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.336 183407 DEBUG oslo_concurrency.lockutils [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "aab8c28e-0489-40bd-88cf-5eb7c419933a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.336 183407 DEBUG oslo_concurrency.lockutils [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.337 183407 DEBUG oslo_concurrency.lockutils [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.337 183407 DEBUG oslo_concurrency.lockutils [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.337 183407 DEBUG oslo_concurrency.lockutils [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.354 183407 INFO nova.compute.manager [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Terminating instance
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.869 183407 DEBUG nova.compute.manager [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:12:33 compute-1 kernel: tapbeb27c83-cf (unregistering): left promiscuous mode
Jan 26 15:12:33 compute-1 NetworkManager[55716]: <info>  [1769440353.9019] device (tapbeb27c83-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:12:33 compute-1 ovn_controller[95641]: 2026-01-26T15:12:33Z|00082|binding|INFO|Releasing lport beb27c83-cf29-484e-a6e9-e9c6a978afd5 from this chassis (sb_readonly=0)
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.906 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:33 compute-1 ovn_controller[95641]: 2026-01-26T15:12:33Z|00083|binding|INFO|Setting lport beb27c83-cf29-484e-a6e9-e9c6a978afd5 down in Southbound
Jan 26 15:12:33 compute-1 ovn_controller[95641]: 2026-01-26T15:12:33Z|00084|binding|INFO|Removing iface tapbeb27c83-cf ovn-installed in OVS
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.909 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:33.917 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:22:83 10.100.0.14'], port_security=['fa:16:3e:8b:22:83 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'aab8c28e-0489-40bd-88cf-5eb7c419933a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=beb27c83-cf29-484e-a6e9-e9c6a978afd5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:12:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:33.917 104930 INFO neutron.agent.ovn.metadata.agent [-] Port beb27c83-cf29-484e-a6e9-e9c6a978afd5 in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf unbound from our chassis
Jan 26 15:12:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:33.918 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:12:33 compute-1 nova_compute[183403]: 2026-01-26 15:12:33.924 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:33.937 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[25266142-05b7-4ac0-a77d-030c8b9dec81]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:33 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 26 15:12:33 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 18.551s CPU time.
Jan 26 15:12:33 compute-1 systemd-machined[154697]: Machine qemu-4-instance-00000007 terminated.
Jan 26 15:12:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:33.969 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[aef64191-9ca9-46d6-bd20-19ec39b24846]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:33.972 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[9f25383b-607d-4c6f-8b78-dc9a7ac65cd5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:34.011 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[b4edc5d3-2870-4beb-a8a5-f0c1ba303ee5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:34.034 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c6983407-b16d-40cd-a974-29ec404ea7cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 1792, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 1792, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 28113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206837, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.057 183407 DEBUG nova.compute.manager [req-b72b8caa-c2d9-4a71-ab51-70a6bf39ff1d req-062b2995-364d-4947-a5a4-c1811b453372 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Received event network-vif-unplugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.058 183407 DEBUG oslo_concurrency.lockutils [req-b72b8caa-c2d9-4a71-ab51-70a6bf39ff1d req-062b2995-364d-4947-a5a4-c1811b453372 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.058 183407 DEBUG oslo_concurrency.lockutils [req-b72b8caa-c2d9-4a71-ab51-70a6bf39ff1d req-062b2995-364d-4947-a5a4-c1811b453372 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.059 183407 DEBUG oslo_concurrency.lockutils [req-b72b8caa-c2d9-4a71-ab51-70a6bf39ff1d req-062b2995-364d-4947-a5a4-c1811b453372 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.059 183407 DEBUG nova.compute.manager [req-b72b8caa-c2d9-4a71-ab51-70a6bf39ff1d req-062b2995-364d-4947-a5a4-c1811b453372 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] No waiting events found dispatching network-vif-unplugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.060 183407 DEBUG nova.compute.manager [req-b72b8caa-c2d9-4a71-ab51-70a6bf39ff1d req-062b2995-364d-4947-a5a4-c1811b453372 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Received event network-vif-unplugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:34.060 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0059ea62-e9d8-4d67-a5f6-243a22c8d3c7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206838, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206838, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:34.062 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.064 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.068 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:34.069 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:34.069 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:34.070 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:34.070 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:34.072 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c4281dfa-e56d-4f61-9a28-b4e92930b79c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.090 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.096 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.139 183407 INFO nova.virt.libvirt.driver [-] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Instance destroyed successfully.
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.140 183407 DEBUG nova.objects.instance [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lazy-loading 'resources' on Instance uuid aab8c28e-0489-40bd-88cf-5eb7c419933a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.526 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.648 183407 DEBUG nova.virt.libvirt.vif [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-368148754',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-368148754',id=7,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:10:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-nrh6656l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:10:16Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=aab8c28e-0489-40bd-88cf-5eb7c419933a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.648 183407 DEBUG nova.network.os_vif_util [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "address": "fa:16:3e:8b:22:83", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeb27c83-cf", "ovs_interfaceid": "beb27c83-cf29-484e-a6e9-e9c6a978afd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.649 183407 DEBUG nova.network.os_vif_util [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:22:83,bridge_name='br-int',has_traffic_filtering=True,id=beb27c83-cf29-484e-a6e9-e9c6a978afd5,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeb27c83-cf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.649 183407 DEBUG os_vif [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:22:83,bridge_name='br-int',has_traffic_filtering=True,id=beb27c83-cf29-484e-a6e9-e9c6a978afd5,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeb27c83-cf') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.650 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.651 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbeb27c83-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.652 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.654 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.654 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.655 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=f3ab9540-8705-49df-a846-bccf7b6ee2f1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.655 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.656 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.658 183407 INFO os_vif [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:22:83,bridge_name='br-int',has_traffic_filtering=True,id=beb27c83-cf29-484e-a6e9-e9c6a978afd5,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeb27c83-cf')
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.659 183407 INFO nova.virt.libvirt.driver [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Deleting instance files /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a_del
Jan 26 15:12:34 compute-1 nova_compute[183403]: 2026-01-26 15:12:34.660 183407 INFO nova.virt.libvirt.driver [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Deletion of /var/lib/nova/instances/aab8c28e-0489-40bd-88cf-5eb7c419933a_del complete
Jan 26 15:12:35 compute-1 nova_compute[183403]: 2026-01-26 15:12:35.172 183407 INFO nova.compute.manager [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Took 1.30 seconds to destroy the instance on the hypervisor.
Jan 26 15:12:35 compute-1 nova_compute[183403]: 2026-01-26 15:12:35.172 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:12:35 compute-1 nova_compute[183403]: 2026-01-26 15:12:35.173 183407 DEBUG nova.compute.manager [-] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:12:35 compute-1 nova_compute[183403]: 2026-01-26 15:12:35.173 183407 DEBUG nova.network.neutron [-] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:12:35 compute-1 nova_compute[183403]: 2026-01-26 15:12:35.173 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:35 compute-1 podman[192725]: time="2026-01-26T15:12:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:12:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:12:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:12:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:12:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2649 "" "Go-http-client/1.1"
Jan 26 15:12:35 compute-1 sshd-session[206822]: Invalid user hugo from 185.246.128.170 port 53958
Jan 26 15:12:35 compute-1 sshd-session[206822]: Disconnecting invalid user hugo 185.246.128.170 port 53958: Change of username or service not allowed: (hugo,ssh-connection) -> (tomcat,ssh-connection) [preauth]
Jan 26 15:12:36 compute-1 nova_compute[183403]: 2026-01-26 15:12:36.164 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:36 compute-1 nova_compute[183403]: 2026-01-26 15:12:36.173 183407 DEBUG nova.compute.manager [req-1a1b9a48-103f-4d23-8ae2-351e916fa4a2 req-c9e58fec-2aad-455d-916c-dc8bf8ec3619 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Received event network-vif-unplugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:36 compute-1 nova_compute[183403]: 2026-01-26 15:12:36.174 183407 DEBUG oslo_concurrency.lockutils [req-1a1b9a48-103f-4d23-8ae2-351e916fa4a2 req-c9e58fec-2aad-455d-916c-dc8bf8ec3619 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:36 compute-1 nova_compute[183403]: 2026-01-26 15:12:36.174 183407 DEBUG oslo_concurrency.lockutils [req-1a1b9a48-103f-4d23-8ae2-351e916fa4a2 req-c9e58fec-2aad-455d-916c-dc8bf8ec3619 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:36 compute-1 nova_compute[183403]: 2026-01-26 15:12:36.174 183407 DEBUG oslo_concurrency.lockutils [req-1a1b9a48-103f-4d23-8ae2-351e916fa4a2 req-c9e58fec-2aad-455d-916c-dc8bf8ec3619 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:36 compute-1 nova_compute[183403]: 2026-01-26 15:12:36.174 183407 DEBUG nova.compute.manager [req-1a1b9a48-103f-4d23-8ae2-351e916fa4a2 req-c9e58fec-2aad-455d-916c-dc8bf8ec3619 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] No waiting events found dispatching network-vif-unplugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:36 compute-1 nova_compute[183403]: 2026-01-26 15:12:36.174 183407 DEBUG nova.compute.manager [req-1a1b9a48-103f-4d23-8ae2-351e916fa4a2 req-c9e58fec-2aad-455d-916c-dc8bf8ec3619 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Received event network-vif-unplugged-beb27c83-cf29-484e-a6e9-e9c6a978afd5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:37 compute-1 nova_compute[183403]: 2026-01-26 15:12:37.332 183407 DEBUG nova.network.neutron [-] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:12:37 compute-1 nova_compute[183403]: 2026-01-26 15:12:37.840 183407 INFO nova.compute.manager [-] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Took 2.67 seconds to deallocate network for instance.
Jan 26 15:12:38 compute-1 nova_compute[183403]: 2026-01-26 15:12:38.239 183407 DEBUG nova.compute.manager [req-796d98ae-2aaa-4619-854e-87277847abab req-f9a52693-4110-4422-893c-43da591f9b4d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: aab8c28e-0489-40bd-88cf-5eb7c419933a] Received event network-vif-deleted-beb27c83-cf29-484e-a6e9-e9c6a978afd5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:38 compute-1 nova_compute[183403]: 2026-01-26 15:12:38.361 183407 DEBUG oslo_concurrency.lockutils [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:38 compute-1 nova_compute[183403]: 2026-01-26 15:12:38.362 183407 DEBUG oslo_concurrency.lockutils [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:38 compute-1 nova_compute[183403]: 2026-01-26 15:12:38.466 183407 DEBUG nova.compute.provider_tree [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:12:38 compute-1 nova_compute[183403]: 2026-01-26 15:12:38.974 183407 DEBUG nova.scheduler.client.report [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:12:39 compute-1 nova_compute[183403]: 2026-01-26 15:12:39.485 183407 DEBUG oslo_concurrency.lockutils [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:39 compute-1 nova_compute[183403]: 2026-01-26 15:12:39.508 183407 INFO nova.scheduler.client.report [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Deleted allocations for instance aab8c28e-0489-40bd-88cf-5eb7c419933a
Jan 26 15:12:39 compute-1 nova_compute[183403]: 2026-01-26 15:12:39.527 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:39 compute-1 nova_compute[183403]: 2026-01-26 15:12:39.656 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:40 compute-1 nova_compute[183403]: 2026-01-26 15:12:40.544 183407 DEBUG oslo_concurrency.lockutils [None req-6bf8755c-f548-4fb2-9a81-5ccc8dc13aca afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "aab8c28e-0489-40bd-88cf-5eb7c419933a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.208s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.252 183407 DEBUG oslo_concurrency.lockutils [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "6532f73e-4d35-42af-b257-7ddf3cd08929" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.253 183407 DEBUG oslo_concurrency.lockutils [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "6532f73e-4d35-42af-b257-7ddf3cd08929" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.253 183407 DEBUG oslo_concurrency.lockutils [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "6532f73e-4d35-42af-b257-7ddf3cd08929-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.253 183407 DEBUG oslo_concurrency.lockutils [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "6532f73e-4d35-42af-b257-7ddf3cd08929-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.254 183407 DEBUG oslo_concurrency.lockutils [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "6532f73e-4d35-42af-b257-7ddf3cd08929-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.270 183407 INFO nova.compute.manager [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Terminating instance
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.788 183407 DEBUG nova.compute.manager [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:12:41 compute-1 kernel: tap2fe6de1e-43 (unregistering): left promiscuous mode
Jan 26 15:12:41 compute-1 NetworkManager[55716]: <info>  [1769440361.8133] device (tap2fe6de1e-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:12:41 compute-1 ovn_controller[95641]: 2026-01-26T15:12:41Z|00085|binding|INFO|Releasing lport 2fe6de1e-4369-4693-b470-ed7d9e9f51ef from this chassis (sb_readonly=0)
Jan 26 15:12:41 compute-1 ovn_controller[95641]: 2026-01-26T15:12:41Z|00086|binding|INFO|Setting lport 2fe6de1e-4369-4693-b470-ed7d9e9f51ef down in Southbound
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.824 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:41 compute-1 ovn_controller[95641]: 2026-01-26T15:12:41Z|00087|binding|INFO|Removing iface tap2fe6de1e-43 ovn-installed in OVS
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.827 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.832 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ff:b6 10.100.0.7'], port_security=['fa:16:3e:a4:ff:b6 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6532f73e-4d35-42af-b257-7ddf3cd08929', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '14', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=2fe6de1e-4369-4693-b470-ed7d9e9f51ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.833 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 2fe6de1e-4369-4693-b470-ed7d9e9f51ef in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf unbound from our chassis
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.840 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.848 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.863 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[684293e5-8342-415d-a315-90f5c1b432dd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:41 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 26 15:12:41 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 4.086s CPU time.
Jan 26 15:12:41 compute-1 systemd-machined[154697]: Machine qemu-6-instance-00000006 terminated.
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.901 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[7ceb455d-dc8d-4c29-8e7a-0ed9d81052ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.904 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc7a9c1-8fce-4c5d-a981-1f21cf22cf47]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.938 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[03565982-5014-4c4e-b8c2-8a18e17a3181]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.960 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d23e279f-7754-437f-889a-191a1aa72819]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 1792, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 1792, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 27856, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206869, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.981 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[27508569-4ba6-4c97-9d46-5c9e74b41669]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206870, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206870, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.983 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.985 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:41 compute-1 nova_compute[183403]: 2026-01-26 15:12:41.988 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.989 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.989 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.989 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.990 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:41.991 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[55dbb9a4-32a1-4bb3-9dcb-1a6663f246c0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.055 183407 INFO nova.virt.libvirt.driver [-] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Instance destroyed successfully.
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.056 183407 DEBUG nova.objects.instance [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lazy-loading 'resources' on Instance uuid 6532f73e-4d35-42af-b257-7ddf3cd08929 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:12:42 compute-1 sshd-session[206856]: Invalid user tomcat from 185.246.128.170 port 7970
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.241 183407 DEBUG nova.compute.manager [req-644833d6-cb32-4d2f-8a08-7e55f0137c59 req-ce220b0d-81e1-48e7-ac9a-75ae43766328 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Received event network-vif-unplugged-2fe6de1e-4369-4693-b470-ed7d9e9f51ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.241 183407 DEBUG oslo_concurrency.lockutils [req-644833d6-cb32-4d2f-8a08-7e55f0137c59 req-ce220b0d-81e1-48e7-ac9a-75ae43766328 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "6532f73e-4d35-42af-b257-7ddf3cd08929-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.241 183407 DEBUG oslo_concurrency.lockutils [req-644833d6-cb32-4d2f-8a08-7e55f0137c59 req-ce220b0d-81e1-48e7-ac9a-75ae43766328 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "6532f73e-4d35-42af-b257-7ddf3cd08929-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.241 183407 DEBUG oslo_concurrency.lockutils [req-644833d6-cb32-4d2f-8a08-7e55f0137c59 req-ce220b0d-81e1-48e7-ac9a-75ae43766328 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "6532f73e-4d35-42af-b257-7ddf3cd08929-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.242 183407 DEBUG nova.compute.manager [req-644833d6-cb32-4d2f-8a08-7e55f0137c59 req-ce220b0d-81e1-48e7-ac9a-75ae43766328 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] No waiting events found dispatching network-vif-unplugged-2fe6de1e-4369-4693-b470-ed7d9e9f51ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.242 183407 DEBUG nova.compute.manager [req-644833d6-cb32-4d2f-8a08-7e55f0137c59 req-ce220b0d-81e1-48e7-ac9a-75ae43766328 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Received event network-vif-unplugged-2fe6de1e-4369-4693-b470-ed7d9e9f51ef for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.562 183407 DEBUG nova.virt.libvirt.vif [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:09:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-367519052',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-367519052',id=6,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:09:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-jy1lc3dj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:11:44Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=6532f73e-4d35-42af-b257-7ddf3cd08929,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "address": "fa:16:3e:a4:ff:b6", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe6de1e-43", "ovs_interfaceid": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.563 183407 DEBUG nova.network.os_vif_util [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "address": "fa:16:3e:a4:ff:b6", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fe6de1e-43", "ovs_interfaceid": "2fe6de1e-4369-4693-b470-ed7d9e9f51ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.564 183407 DEBUG nova.network.os_vif_util [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a4:ff:b6,bridge_name='br-int',has_traffic_filtering=True,id=2fe6de1e-4369-4693-b470-ed7d9e9f51ef,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fe6de1e-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.564 183407 DEBUG os_vif [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a4:ff:b6,bridge_name='br-int',has_traffic_filtering=True,id=2fe6de1e-4369-4693-b470-ed7d9e9f51ef,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fe6de1e-43') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.567 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.567 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fe6de1e-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.569 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.570 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.571 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.572 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3320c8e3-eb3c-4a8b-b1b8-8812d8af6275) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.573 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.574 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.577 183407 INFO os_vif [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a4:ff:b6,bridge_name='br-int',has_traffic_filtering=True,id=2fe6de1e-4369-4693-b470-ed7d9e9f51ef,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fe6de1e-43')
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.578 183407 INFO nova.virt.libvirt.driver [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Deleting instance files /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929_del
Jan 26 15:12:42 compute-1 nova_compute[183403]: 2026-01-26 15:12:42.579 183407 INFO nova.virt.libvirt.driver [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Deletion of /var/lib/nova/instances/6532f73e-4d35-42af-b257-7ddf3cd08929_del complete
Jan 26 15:12:43 compute-1 nova_compute[183403]: 2026-01-26 15:12:43.094 183407 INFO nova.compute.manager [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Took 1.31 seconds to destroy the instance on the hypervisor.
Jan 26 15:12:43 compute-1 nova_compute[183403]: 2026-01-26 15:12:43.096 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:12:43 compute-1 nova_compute[183403]: 2026-01-26 15:12:43.097 183407 DEBUG nova.compute.manager [-] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:12:43 compute-1 nova_compute[183403]: 2026-01-26 15:12:43.097 183407 DEBUG nova.network.neutron [-] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:12:43 compute-1 nova_compute[183403]: 2026-01-26 15:12:43.098 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:44 compute-1 nova_compute[183403]: 2026-01-26 15:12:44.103 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:44 compute-1 nova_compute[183403]: 2026-01-26 15:12:44.290 183407 DEBUG nova.compute.manager [req-ce0c48bb-24c4-4514-9ac5-7046eb0b4b91 req-4c5d4798-03ca-44d9-a247-7e81fffd5eb7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Received event network-vif-unplugged-2fe6de1e-4369-4693-b470-ed7d9e9f51ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:44 compute-1 nova_compute[183403]: 2026-01-26 15:12:44.291 183407 DEBUG oslo_concurrency.lockutils [req-ce0c48bb-24c4-4514-9ac5-7046eb0b4b91 req-4c5d4798-03ca-44d9-a247-7e81fffd5eb7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "6532f73e-4d35-42af-b257-7ddf3cd08929-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:44 compute-1 nova_compute[183403]: 2026-01-26 15:12:44.291 183407 DEBUG oslo_concurrency.lockutils [req-ce0c48bb-24c4-4514-9ac5-7046eb0b4b91 req-4c5d4798-03ca-44d9-a247-7e81fffd5eb7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "6532f73e-4d35-42af-b257-7ddf3cd08929-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:44 compute-1 nova_compute[183403]: 2026-01-26 15:12:44.291 183407 DEBUG oslo_concurrency.lockutils [req-ce0c48bb-24c4-4514-9ac5-7046eb0b4b91 req-4c5d4798-03ca-44d9-a247-7e81fffd5eb7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "6532f73e-4d35-42af-b257-7ddf3cd08929-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:44 compute-1 nova_compute[183403]: 2026-01-26 15:12:44.291 183407 DEBUG nova.compute.manager [req-ce0c48bb-24c4-4514-9ac5-7046eb0b4b91 req-4c5d4798-03ca-44d9-a247-7e81fffd5eb7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] No waiting events found dispatching network-vif-unplugged-2fe6de1e-4369-4693-b470-ed7d9e9f51ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:44 compute-1 nova_compute[183403]: 2026-01-26 15:12:44.292 183407 DEBUG nova.compute.manager [req-ce0c48bb-24c4-4514-9ac5-7046eb0b4b91 req-4c5d4798-03ca-44d9-a247-7e81fffd5eb7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Received event network-vif-unplugged-2fe6de1e-4369-4693-b470-ed7d9e9f51ef for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:44 compute-1 nova_compute[183403]: 2026-01-26 15:12:44.530 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:44 compute-1 sshd-session[206856]: Disconnecting invalid user tomcat 185.246.128.170 port 7970: Change of username or service not allowed: (tomcat,ssh-connection) -> (nginx,ssh-connection) [preauth]
Jan 26 15:12:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:46.118 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:12:46 compute-1 nova_compute[183403]: 2026-01-26 15:12:46.118 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:46.118 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:12:46 compute-1 nova_compute[183403]: 2026-01-26 15:12:46.338 183407 DEBUG nova.compute.manager [req-0e0fd8a2-122f-4ec0-92be-070b1543acf6 req-e9156021-f1e0-43e8-9504-4a5f24a29ce1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Received event network-vif-deleted-2fe6de1e-4369-4693-b470-ed7d9e9f51ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:46 compute-1 nova_compute[183403]: 2026-01-26 15:12:46.339 183407 INFO nova.compute.manager [req-0e0fd8a2-122f-4ec0-92be-070b1543acf6 req-e9156021-f1e0-43e8-9504-4a5f24a29ce1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Neutron deleted interface 2fe6de1e-4369-4693-b470-ed7d9e9f51ef; detaching it from the instance and deleting it from the info cache
Jan 26 15:12:46 compute-1 nova_compute[183403]: 2026-01-26 15:12:46.339 183407 DEBUG nova.network.neutron [req-0e0fd8a2-122f-4ec0-92be-070b1543acf6 req-e9156021-f1e0-43e8-9504-4a5f24a29ce1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:12:46 compute-1 nova_compute[183403]: 2026-01-26 15:12:46.731 183407 DEBUG nova.network.neutron [-] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:12:46 compute-1 sshd-session[206889]: Invalid user nginx from 185.246.128.170 port 54670
Jan 26 15:12:46 compute-1 nova_compute[183403]: 2026-01-26 15:12:46.851 183407 DEBUG nova.compute.manager [req-0e0fd8a2-122f-4ec0-92be-070b1543acf6 req-e9156021-f1e0-43e8-9504-4a5f24a29ce1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Detach interface failed, port_id=2fe6de1e-4369-4693-b470-ed7d9e9f51ef, reason: Instance 6532f73e-4d35-42af-b257-7ddf3cd08929 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:12:47 compute-1 nova_compute[183403]: 2026-01-26 15:12:47.238 183407 INFO nova.compute.manager [-] [instance: 6532f73e-4d35-42af-b257-7ddf3cd08929] Took 4.14 seconds to deallocate network for instance.
Jan 26 15:12:47 compute-1 nova_compute[183403]: 2026-01-26 15:12:47.573 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:47 compute-1 nova_compute[183403]: 2026-01-26 15:12:47.764 183407 DEBUG oslo_concurrency.lockutils [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:47 compute-1 nova_compute[183403]: 2026-01-26 15:12:47.765 183407 DEBUG oslo_concurrency.lockutils [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:47 compute-1 nova_compute[183403]: 2026-01-26 15:12:47.856 183407 DEBUG nova.compute.provider_tree [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:12:47 compute-1 sshd-session[206889]: Disconnecting invalid user nginx 185.246.128.170 port 54670: Change of username or service not allowed: (nginx,ssh-connection) -> (redmine,ssh-connection) [preauth]
Jan 26 15:12:48 compute-1 nova_compute[183403]: 2026-01-26 15:12:48.365 183407 DEBUG nova.scheduler.client.report [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:12:48 compute-1 nova_compute[183403]: 2026-01-26 15:12:48.881 183407 DEBUG oslo_concurrency.lockutils [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:48 compute-1 nova_compute[183403]: 2026-01-26 15:12:48.944 183407 INFO nova.scheduler.client.report [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Deleted allocations for instance 6532f73e-4d35-42af-b257-7ddf3cd08929
Jan 26 15:12:49 compute-1 openstack_network_exporter[195610]: ERROR   15:12:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:12:49 compute-1 openstack_network_exporter[195610]: ERROR   15:12:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:12:49 compute-1 nova_compute[183403]: 2026-01-26 15:12:49.532 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:49 compute-1 nova_compute[183403]: 2026-01-26 15:12:49.975 183407 DEBUG oslo_concurrency.lockutils [None req-97744130-53ff-4fad-9457-80d2c8fdb1f2 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "6532f73e-4d35-42af-b257-7ddf3cd08929" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.723s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:50.121 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:52 compute-1 nova_compute[183403]: 2026-01-26 15:12:52.576 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:52 compute-1 nova_compute[183403]: 2026-01-26 15:12:52.797 183407 DEBUG oslo_concurrency.lockutils [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "66a7af21-1abe-467f-b739-441e05a4b09a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:52 compute-1 nova_compute[183403]: 2026-01-26 15:12:52.799 183407 DEBUG oslo_concurrency.lockutils [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:52 compute-1 nova_compute[183403]: 2026-01-26 15:12:52.799 183407 DEBUG oslo_concurrency.lockutils [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:52 compute-1 nova_compute[183403]: 2026-01-26 15:12:52.799 183407 DEBUG oslo_concurrency.lockutils [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:52 compute-1 nova_compute[183403]: 2026-01-26 15:12:52.800 183407 DEBUG oslo_concurrency.lockutils [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:52 compute-1 nova_compute[183403]: 2026-01-26 15:12:52.811 183407 INFO nova.compute.manager [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Terminating instance
Jan 26 15:12:52 compute-1 podman[206894]: 2026-01-26 15:12:52.911002386 +0000 UTC m=+0.073103357 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:12:52 compute-1 podman[206895]: 2026-01-26 15:12:52.913575243 +0000 UTC m=+0.071517104 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.323 183407 DEBUG nova.compute.manager [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:12:53 compute-1 kernel: tap1fd7a551-45 (unregistering): left promiscuous mode
Jan 26 15:12:53 compute-1 NetworkManager[55716]: <info>  [1769440373.7009] device (tap1fd7a551-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:12:53 compute-1 ovn_controller[95641]: 2026-01-26T15:12:53Z|00088|binding|INFO|Releasing lport 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 from this chassis (sb_readonly=0)
Jan 26 15:12:53 compute-1 ovn_controller[95641]: 2026-01-26T15:12:53Z|00089|binding|INFO|Setting lport 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 down in Southbound
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.711 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:53 compute-1 ovn_controller[95641]: 2026-01-26T15:12:53Z|00090|binding|INFO|Removing iface tap1fd7a551-45 ovn-installed in OVS
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.723 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:59:eb 10.100.0.9'], port_security=['fa:16:3e:da:59:eb 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '66a7af21-1abe-467f-b739-441e05a4b09a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=1fd7a551-45a6-412c-abb2-e2d57c2b25e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.725 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 1fd7a551-45a6-412c-abb2-e2d57c2b25e8 in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf unbound from our chassis
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.729 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4a37c9f-5b64-4f94-80e9-126c911b1acf
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.756 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[efad0ffa-c9e0-43c6-b47f-2121e0f79ae6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.770 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:53 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 26 15:12:53 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 23.471s CPU time.
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.800 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[4e934866-7138-4f17-9b77-929263d29afd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:53 compute-1 systemd-machined[154697]: Machine qemu-2-instance-00000005 terminated.
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.804 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[3c523809-f729-46c5-99ed-916031e3ba1c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.836 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac4b818-d2d7-4e81-9e6e-c30e4336e9ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.860 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[aea5d8b9-810e-45ad-b0e7-28ef2b05ba5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4a37c9f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:55:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 1792, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 1792, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385314, 'reachable_time': 27856, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206948, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.878 183407 DEBUG nova.compute.manager [req-4c337f65-76ee-40bd-ab05-af577a5c531c req-77669cfc-1fd8-4066-835d-b74265b10ba0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Received event network-vif-unplugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.879 183407 DEBUG oslo_concurrency.lockutils [req-4c337f65-76ee-40bd-ab05-af577a5c531c req-77669cfc-1fd8-4066-835d-b74265b10ba0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.879 183407 DEBUG oslo_concurrency.lockutils [req-4c337f65-76ee-40bd-ab05-af577a5c531c req-77669cfc-1fd8-4066-835d-b74265b10ba0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.879 183407 DEBUG oslo_concurrency.lockutils [req-4c337f65-76ee-40bd-ab05-af577a5c531c req-77669cfc-1fd8-4066-835d-b74265b10ba0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.879 183407 DEBUG nova.compute.manager [req-4c337f65-76ee-40bd-ab05-af577a5c531c req-77669cfc-1fd8-4066-835d-b74265b10ba0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] No waiting events found dispatching network-vif-unplugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.880 183407 DEBUG nova.compute.manager [req-4c337f65-76ee-40bd-ab05-af577a5c531c req-77669cfc-1fd8-4066-835d-b74265b10ba0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Received event network-vif-unplugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.883 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff8c26d-9bd2-44fd-a001-e6144befb80e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385324, 'tstamp': 385324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206949, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4a37c9f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385327, 'tstamp': 385327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206949, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.885 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.887 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.893 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.894 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4a37c9f-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.894 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.894 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4a37c9f-50, col_values=(('external_ids', {'iface-id': '3415b7f1-5b64-48d1-b20f-4c68422efc0e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.895 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:12:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:12:53.896 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[df50f073-1836-49da-8096-bf9f0e6fdba6]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d4a37c9f-5b64-4f94-80e9-126c911b1acf\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d4a37c9f-5b64-4f94-80e9-126c911b1acf\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.948 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.953 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.991 183407 INFO nova.virt.libvirt.driver [-] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Instance destroyed successfully.
Jan 26 15:12:53 compute-1 nova_compute[183403]: 2026-01-26 15:12:53.992 183407 DEBUG nova.objects.instance [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lazy-loading 'resources' on Instance uuid 66a7af21-1abe-467f-b739-441e05a4b09a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.498 183407 DEBUG nova.virt.libvirt.vif [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:08:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-804615053',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-804615053',id=5,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:08:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-hxr43d02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:08:53Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=66a7af21-1abe-467f-b739-441e05a4b09a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.499 183407 DEBUG nova.network.os_vif_util [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "address": "fa:16:3e:da:59:eb", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fd7a551-45", "ovs_interfaceid": "1fd7a551-45a6-412c-abb2-e2d57c2b25e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.500 183407 DEBUG nova.network.os_vif_util [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:59:eb,bridge_name='br-int',has_traffic_filtering=True,id=1fd7a551-45a6-412c-abb2-e2d57c2b25e8,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fd7a551-45') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.500 183407 DEBUG os_vif [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:59:eb,bridge_name='br-int',has_traffic_filtering=True,id=1fd7a551-45a6-412c-abb2-e2d57c2b25e8,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fd7a551-45') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.502 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.503 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1fd7a551-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.505 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.507 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.509 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.509 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=715b2da4-57f5-4aa8-962d-81f147f979ad) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.511 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.512 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.515 183407 INFO os_vif [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:59:eb,bridge_name='br-int',has_traffic_filtering=True,id=1fd7a551-45a6-412c-abb2-e2d57c2b25e8,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fd7a551-45')
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.516 183407 INFO nova.virt.libvirt.driver [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Deleting instance files /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a_del
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.517 183407 INFO nova.virt.libvirt.driver [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Deletion of /var/lib/nova/instances/66a7af21-1abe-467f-b739-441e05a4b09a_del complete
Jan 26 15:12:54 compute-1 nova_compute[183403]: 2026-01-26 15:12:54.534 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.030 183407 INFO nova.compute.manager [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Took 1.71 seconds to destroy the instance on the hypervisor.
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.031 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.031 183407 DEBUG nova.compute.manager [-] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.032 183407 DEBUG nova.network.neutron [-] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.032 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:55 compute-1 sshd-session[206892]: Invalid user redmine from 185.246.128.170 port 26793
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.140 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:12:55 compute-1 sshd-session[206892]: Disconnecting invalid user redmine 185.246.128.170 port 26793: Change of username or service not allowed: (redmine,ssh-connection) -> (helpdesk,ssh-connection) [preauth]
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.621 183407 DEBUG nova.compute.manager [req-6175afe2-1e6d-4447-9938-f827521fd660 req-74b8c83b-6dc4-40f4-8222-db2fdb50cf68 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Received event network-vif-deleted-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.621 183407 INFO nova.compute.manager [req-6175afe2-1e6d-4447-9938-f827521fd660 req-74b8c83b-6dc4-40f4-8222-db2fdb50cf68 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Neutron deleted interface 1fd7a551-45a6-412c-abb2-e2d57c2b25e8; detaching it from the instance and deleting it from the info cache
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.621 183407 DEBUG nova.network.neutron [req-6175afe2-1e6d-4447-9938-f827521fd660 req-74b8c83b-6dc4-40f4-8222-db2fdb50cf68 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.950 183407 DEBUG nova.compute.manager [req-9df0d7ff-51b3-453b-ab0d-2ee65d0b4ee1 req-ba5bf490-d1af-4484-bfe1-f5e882079ac6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Received event network-vif-unplugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.951 183407 DEBUG oslo_concurrency.lockutils [req-9df0d7ff-51b3-453b-ab0d-2ee65d0b4ee1 req-ba5bf490-d1af-4484-bfe1-f5e882079ac6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.951 183407 DEBUG oslo_concurrency.lockutils [req-9df0d7ff-51b3-453b-ab0d-2ee65d0b4ee1 req-ba5bf490-d1af-4484-bfe1-f5e882079ac6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.951 183407 DEBUG oslo_concurrency.lockutils [req-9df0d7ff-51b3-453b-ab0d-2ee65d0b4ee1 req-ba5bf490-d1af-4484-bfe1-f5e882079ac6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.951 183407 DEBUG nova.compute.manager [req-9df0d7ff-51b3-453b-ab0d-2ee65d0b4ee1 req-ba5bf490-d1af-4484-bfe1-f5e882079ac6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] No waiting events found dispatching network-vif-unplugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:12:55 compute-1 nova_compute[183403]: 2026-01-26 15:12:55.951 183407 DEBUG nova.compute.manager [req-9df0d7ff-51b3-453b-ab0d-2ee65d0b4ee1 req-ba5bf490-d1af-4484-bfe1-f5e882079ac6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Received event network-vif-unplugged-1fd7a551-45a6-412c-abb2-e2d57c2b25e8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:12:56 compute-1 nova_compute[183403]: 2026-01-26 15:12:56.089 183407 DEBUG nova.network.neutron [-] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:12:56 compute-1 nova_compute[183403]: 2026-01-26 15:12:56.147 183407 DEBUG nova.compute.manager [req-6175afe2-1e6d-4447-9938-f827521fd660 req-74b8c83b-6dc4-40f4-8222-db2fdb50cf68 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Detach interface failed, port_id=1fd7a551-45a6-412c-abb2-e2d57c2b25e8, reason: Instance 66a7af21-1abe-467f-b739-441e05a4b09a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:12:56 compute-1 nova_compute[183403]: 2026-01-26 15:12:56.602 183407 INFO nova.compute.manager [-] [instance: 66a7af21-1abe-467f-b739-441e05a4b09a] Took 1.57 seconds to deallocate network for instance.
Jan 26 15:12:57 compute-1 nova_compute[183403]: 2026-01-26 15:12:57.156 183407 DEBUG oslo_concurrency.lockutils [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:12:57 compute-1 nova_compute[183403]: 2026-01-26 15:12:57.156 183407 DEBUG oslo_concurrency.lockutils [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:12:57 compute-1 nova_compute[183403]: 2026-01-26 15:12:57.247 183407 DEBUG nova.compute.provider_tree [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:12:57 compute-1 nova_compute[183403]: 2026-01-26 15:12:57.835 183407 DEBUG nova.scheduler.client.report [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:12:58 compute-1 nova_compute[183403]: 2026-01-26 15:12:58.349 183407 DEBUG oslo_concurrency.lockutils [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.193s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:12:58 compute-1 sshd-session[206967]: Invalid user helpdesk from 185.246.128.170 port 39800
Jan 26 15:12:58 compute-1 nova_compute[183403]: 2026-01-26 15:12:58.648 183407 INFO nova.scheduler.client.report [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Deleted allocations for instance 66a7af21-1abe-467f-b739-441e05a4b09a
Jan 26 15:12:59 compute-1 nova_compute[183403]: 2026-01-26 15:12:59.512 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:59 compute-1 nova_compute[183403]: 2026-01-26 15:12:59.536 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:12:59 compute-1 nova_compute[183403]: 2026-01-26 15:12:59.950 183407 DEBUG oslo_concurrency.lockutils [None req-06990938-0697-4359-8bd5-730faeb8071d afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "66a7af21-1abe-467f-b739-441e05a4b09a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:13:01 compute-1 nova_compute[183403]: 2026-01-26 15:13:01.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:13:01 compute-1 nova_compute[183403]: 2026-01-26 15:13:01.944 183407 DEBUG oslo_concurrency.lockutils [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "8c64a2e0-f723-4adb-84fc-867073a92349" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:13:01 compute-1 nova_compute[183403]: 2026-01-26 15:13:01.945 183407 DEBUG oslo_concurrency.lockutils [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:13:01 compute-1 nova_compute[183403]: 2026-01-26 15:13:01.946 183407 DEBUG oslo_concurrency.lockutils [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:13:01 compute-1 nova_compute[183403]: 2026-01-26 15:13:01.946 183407 DEBUG oslo_concurrency.lockutils [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:13:01 compute-1 nova_compute[183403]: 2026-01-26 15:13:01.946 183407 DEBUG oslo_concurrency.lockutils [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:13:01 compute-1 nova_compute[183403]: 2026-01-26 15:13:01.962 183407 INFO nova.compute.manager [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Terminating instance
Jan 26 15:13:02 compute-1 sshd-session[206967]: Disconnecting invalid user helpdesk 185.246.128.170 port 39800: Change of username or service not allowed: (helpdesk,ssh-connection) -> (alireza,ssh-connection) [preauth]
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.481 183407 DEBUG nova.compute.manager [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:13:02 compute-1 kernel: tap6dd62b2f-19 (unregistering): left promiscuous mode
Jan 26 15:13:02 compute-1 NetworkManager[55716]: <info>  [1769440382.5101] device (tap6dd62b2f-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:13:02 compute-1 ovn_controller[95641]: 2026-01-26T15:13:02Z|00091|binding|INFO|Releasing lport 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 from this chassis (sb_readonly=0)
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.516 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:02 compute-1 ovn_controller[95641]: 2026-01-26T15:13:02Z|00092|binding|INFO|Setting lport 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 down in Southbound
Jan 26 15:13:02 compute-1 ovn_controller[95641]: 2026-01-26T15:13:02Z|00093|binding|INFO|Removing iface tap6dd62b2f-19 ovn-installed in OVS
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.519 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:02.531 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:f3:28 10.100.0.8'], port_security=['fa:16:3e:91:f3:28 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8c64a2e0-f723-4adb-84fc-867073a92349', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6377892a338d4a7cbe63cf30bd2c63ea', 'neutron:revision_number': '10', 'neutron:security_group_ids': '6ec487f2-f407-43f7-8fd3-02f4d5e73158', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=922d1c2a-bc46-47ee-81d5-242719303ef7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:13:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:02.532 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 in datapath d4a37c9f-5b64-4f94-80e9-126c911b1acf unbound from our chassis
Jan 26 15:13:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:02.534 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4a37c9f-5b64-4f94-80e9-126c911b1acf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:13:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:02.535 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[214ebe43-3ab2-46d9-93c1-8e95f6c23194]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:02.535 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf namespace which is not needed anymore
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.535 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:02 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 26 15:13:02 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 23.236s CPU time.
Jan 26 15:13:02 compute-1 systemd-machined[154697]: Machine qemu-3-instance-00000004 terminated.
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:13:02 compute-1 podman[206976]: 2026-01-26 15:13:02.654249603 +0000 UTC m=+0.099015537 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 15:13:02 compute-1 podman[206973]: 2026-01-26 15:13:02.663702534 +0000 UTC m=+0.108785725 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 26 15:13:02 compute-1 neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf[205425]: [NOTICE]   (205429) : haproxy version is 3.0.5-8e879a5
Jan 26 15:13:02 compute-1 neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf[205425]: [NOTICE]   (205429) : path to executable is /usr/sbin/haproxy
Jan 26 15:13:02 compute-1 neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf[205425]: [WARNING]  (205429) : Exiting Master process...
Jan 26 15:13:02 compute-1 podman[207033]: 2026-01-26 15:13:02.682622795 +0000 UTC m=+0.035851929 container kill feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Jan 26 15:13:02 compute-1 neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf[205425]: [ALERT]    (205429) : Current worker (205431) exited with code 143 (Terminated)
Jan 26 15:13:02 compute-1 neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf[205425]: [WARNING]  (205429) : All workers exited. Exiting... (0)
Jan 26 15:13:02 compute-1 systemd[1]: libpod-feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5.scope: Deactivated successfully.
Jan 26 15:13:02 compute-1 kernel: tap6dd62b2f-19: entered promiscuous mode
Jan 26 15:13:02 compute-1 kernel: tap6dd62b2f-19 (unregistering): left promiscuous mode
Jan 26 15:13:02 compute-1 NetworkManager[55716]: <info>  [1769440382.7028] manager: (tap6dd62b2f-19): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.704 183407 DEBUG nova.compute.manager [req-60c48206-15e4-4ae6-9bad-6db42aa48cef req-acaebb45-5b76-4bce-a4f6-3b9084f62b62 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.705 183407 DEBUG oslo_concurrency.lockutils [req-60c48206-15e4-4ae6-9bad-6db42aa48cef req-acaebb45-5b76-4bce-a4f6-3b9084f62b62 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.705 183407 DEBUG oslo_concurrency.lockutils [req-60c48206-15e4-4ae6-9bad-6db42aa48cef req-acaebb45-5b76-4bce-a4f6-3b9084f62b62 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.705 183407 DEBUG oslo_concurrency.lockutils [req-60c48206-15e4-4ae6-9bad-6db42aa48cef req-acaebb45-5b76-4bce-a4f6-3b9084f62b62 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.705 183407 DEBUG nova.compute.manager [req-60c48206-15e4-4ae6-9bad-6db42aa48cef req-acaebb45-5b76-4bce-a4f6-3b9084f62b62 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] No waiting events found dispatching network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.705 183407 DEBUG nova.compute.manager [req-60c48206-15e4-4ae6-9bad-6db42aa48cef req-acaebb45-5b76-4bce-a4f6-3b9084f62b62 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.708 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.742 183407 INFO nova.virt.libvirt.driver [-] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Instance destroyed successfully.
Jan 26 15:13:02 compute-1 nova_compute[183403]: 2026-01-26 15:13:02.742 183407 DEBUG nova.objects.instance [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lazy-loading 'resources' on Instance uuid 8c64a2e0-f723-4adb-84fc-867073a92349 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:13:02 compute-1 podman[207052]: 2026-01-26 15:13:02.745182479 +0000 UTC m=+0.037920296 container died feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 15:13:02 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5-userdata-shm.mount: Deactivated successfully.
Jan 26 15:13:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-586c742fcf2df3da6069b7da1e64cb5522dd4fc218e9b835e719d2c137bf039e-merged.mount: Deactivated successfully.
Jan 26 15:13:03 compute-1 podman[207052]: 2026-01-26 15:13:03.058770285 +0000 UTC m=+0.351508092 container cleanup feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:13:03 compute-1 systemd[1]: libpod-conmon-feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5.scope: Deactivated successfully.
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.090 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:13:03 compute-1 podman[207074]: 2026-01-26 15:13:03.184281792 +0000 UTC m=+0.445197559 container remove feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260120)
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.190 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d359ba09-5c89-42b0-8cc5-6915d91def48]: (4, ("Mon Jan 26 03:13:02 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf (feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5)\nfeabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5\nMon Jan 26 03:13:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf (feabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5)\nfeabdf5b0994c1edeabaecd875864b5ee430fe643f24b2538ab65140f11b1ab5\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.192 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bdafabfb-265a-4337-96fc-892b9b4f9281]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.192 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4a37c9f-5b64-4f94-80e9-126c911b1acf.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.193 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d291eeca-9b4c-4362-9586-328767b015b7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.193 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4a37c9f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.195 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:03 compute-1 kernel: tapd4a37c9f-50: left promiscuous mode
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.213 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.218 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9de37a6c-5676-4b62-8a2e-4af90b38f846]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.234 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7c11e4e9-1cf5-43e1-97a5-d856ac0fde86]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.236 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa12a55-c0d1-458f-9401-e4bb29bdba6c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.251 183407 DEBUG nova.virt.libvirt.vif [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:08:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-737487078',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-737487078',id=4,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:09:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6377892a338d4a7cbe63cf30bd2c63ea',ramdisk_id='',reservation_id='r-hm0fd26d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-280856547',owner_user_name='tempest-TestExecuteActionsViaActuator-280856547-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:09:45Z,user_data=None,user_id='afb4f4811cb043dca89a8413c390ba3d',uuid=8c64a2e0-f723-4adb-84fc-867073a92349,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.252 183407 DEBUG nova.network.os_vif_util [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converting VIF {"id": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "address": "fa:16:3e:91:f3:28", "network": {"id": "d4a37c9f-5b64-4f94-80e9-126c911b1acf", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1446571951-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7dc5e9070a084dfcb543a08e87868f39", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6dd62b2f-19", "ovs_interfaceid": "6dd62b2f-1957-4fa5-92d8-6a7d131f0d09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.253 183407 DEBUG nova.network.os_vif_util [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:91:f3:28,bridge_name='br-int',has_traffic_filtering=True,id=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6dd62b2f-19') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.254 183407 DEBUG os_vif [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:f3:28,bridge_name='br-int',has_traffic_filtering=True,id=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6dd62b2f-19') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.256 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.257 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6dd62b2f-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.259 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.260 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[995651f9-f884-4c62-8465-523714ff2c55]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385307, 'reachable_time': 35955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207103, 'error': None, 'target': 'ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.262 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.263 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d4a37c9f-5b64-4f94-80e9-126c911b1acf deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.263 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:03.263 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed82a28-e929-4c27-8d05-dc6664b6b9bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.264 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c1eb6b98-b987-44d8-ad5e-8de75dbf5921) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:13:03 compute-1 systemd[1]: run-netns-ovnmeta\x2dd4a37c9f\x2d5b64\x2d4f94\x2d80e9\x2d126c911b1acf.mount: Deactivated successfully.
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.265 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.267 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.270 183407 INFO os_vif [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:f3:28,bridge_name='br-int',has_traffic_filtering=True,id=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09,network=Network(d4a37c9f-5b64-4f94-80e9-126c911b1acf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6dd62b2f-19')
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.271 183407 INFO nova.virt.libvirt.driver [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Deleting instance files /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349_del
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.275 183407 INFO nova.virt.libvirt.driver [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Deletion of /var/lib/nova/instances/8c64a2e0-f723-4adb-84fc-867073a92349_del complete
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.785 183407 INFO nova.compute.manager [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Took 1.30 seconds to destroy the instance on the hypervisor.
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.786 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.786 183407 DEBUG nova.compute.manager [-] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.786 183407 DEBUG nova.network.neutron [-] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:13:03 compute-1 nova_compute[183403]: 2026-01-26 15:13:03.787 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.103 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.155 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Error from libvirt while getting description of instance-00000004: [Error Code 42] Domain not found: no domain with matching uuid '8c64a2e0-f723-4adb-84fc-867073a92349' (instance-00000004): libvirt.libvirtError: Domain not found: no domain with matching uuid '8c64a2e0-f723-4adb-84fc-867073a92349' (instance-00000004)
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.327 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.329 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.356 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.357 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5809MB free_disk=73.12031936645508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.357 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.358 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.537 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:04 compute-1 nova_compute[183403]: 2026-01-26 15:13:04.848 183407 DEBUG nova.network.neutron [-] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.346 183407 DEBUG nova.compute.manager [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.347 183407 DEBUG oslo_concurrency.lockutils [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.347 183407 DEBUG oslo_concurrency.lockutils [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.347 183407 DEBUG oslo_concurrency.lockutils [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.348 183407 DEBUG nova.compute.manager [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] No waiting events found dispatching network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.348 183407 DEBUG nova.compute.manager [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-vif-unplugged-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.348 183407 DEBUG nova.compute.manager [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Received event network-vif-deleted-6dd62b2f-1957-4fa5-92d8-6a7d131f0d09 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.348 183407 INFO nova.compute.manager [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Neutron deleted interface 6dd62b2f-1957-4fa5-92d8-6a7d131f0d09; detaching it from the instance and deleting it from the info cache
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.349 183407 DEBUG nova.network.neutron [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:13:05 compute-1 podman[192725]: time="2026-01-26T15:13:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:13:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:13:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:13:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:13:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2187 "" "Go-http-client/1.1"
Jan 26 15:13:05 compute-1 nova_compute[183403]: 2026-01-26 15:13:05.930 183407 INFO nova.compute.manager [-] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Took 2.14 seconds to deallocate network for instance.
Jan 26 15:13:06 compute-1 nova_compute[183403]: 2026-01-26 15:13:06.021 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 8c64a2e0-f723-4adb-84fc-867073a92349 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:13:06 compute-1 nova_compute[183403]: 2026-01-26 15:13:06.022 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:13:06 compute-1 nova_compute[183403]: 2026-01-26 15:13:06.022 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:13:04 up  1:08,  0 user,  load average: 0.48, 0.32, 0.38\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_6377892a338d4a7cbe63cf30bd2c63ea': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:13:06 compute-1 nova_compute[183403]: 2026-01-26 15:13:06.067 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:13:06 compute-1 nova_compute[183403]: 2026-01-26 15:13:06.285 183407 DEBUG nova.compute.manager [req-ff1e6892-f465-4b66-ab39-3d4be78185b7 req-adbcf5a6-cd70-4550-a608-3cc287037f88 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8c64a2e0-f723-4adb-84fc-867073a92349] Detach interface failed, port_id=6dd62b2f-1957-4fa5-92d8-6a7d131f0d09, reason: Instance 8c64a2e0-f723-4adb-84fc-867073a92349 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:13:06 compute-1 nova_compute[183403]: 2026-01-26 15:13:06.650 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:13:06 compute-1 nova_compute[183403]: 2026-01-26 15:13:06.669 183407 DEBUG oslo_concurrency.lockutils [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:13:07 compute-1 nova_compute[183403]: 2026-01-26 15:13:07.394 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:13:07 compute-1 nova_compute[183403]: 2026-01-26 15:13:07.394 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.037s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:13:07 compute-1 nova_compute[183403]: 2026-01-26 15:13:07.395 183407 DEBUG oslo_concurrency.lockutils [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.726s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:13:07 compute-1 nova_compute[183403]: 2026-01-26 15:13:07.431 183407 DEBUG nova.compute.provider_tree [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:13:08 compute-1 nova_compute[183403]: 2026-01-26 15:13:08.266 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:08 compute-1 nova_compute[183403]: 2026-01-26 15:13:08.999 183407 DEBUG nova.scheduler.client.report [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:13:09 compute-1 nova_compute[183403]: 2026-01-26 15:13:09.539 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:09 compute-1 nova_compute[183403]: 2026-01-26 15:13:09.655 183407 DEBUG oslo_concurrency.lockutils [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.261s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:13:09 compute-1 nova_compute[183403]: 2026-01-26 15:13:09.922 183407 INFO nova.scheduler.client.report [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Deleted allocations for instance 8c64a2e0-f723-4adb-84fc-867073a92349
Jan 26 15:13:10 compute-1 nova_compute[183403]: 2026-01-26 15:13:10.395 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:13:10 compute-1 nova_compute[183403]: 2026-01-26 15:13:10.396 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:13:10 compute-1 nova_compute[183403]: 2026-01-26 15:13:10.396 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:13:10 compute-1 nova_compute[183403]: 2026-01-26 15:13:10.397 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:13:10 compute-1 nova_compute[183403]: 2026-01-26 15:13:10.397 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:13:10 compute-1 nova_compute[183403]: 2026-01-26 15:13:10.574 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:13:11 compute-1 nova_compute[183403]: 2026-01-26 15:13:11.550 183407 DEBUG oslo_concurrency.lockutils [None req-429e8eb9-b26d-47ce-a419-03182ae65d84 afb4f4811cb043dca89a8413c390ba3d 6377892a338d4a7cbe63cf30bd2c63ea - - default default] Lock "8c64a2e0-f723-4adb-84fc-867073a92349" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.605s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:13:11 compute-1 nova_compute[183403]: 2026-01-26 15:13:11.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:13:13 compute-1 nova_compute[183403]: 2026-01-26 15:13:13.268 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:14 compute-1 nova_compute[183403]: 2026-01-26 15:13:14.541 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:15 compute-1 sshd-session[207105]: Invalid user alireza from 185.246.128.170 port 59493
Jan 26 15:13:16 compute-1 sshd-session[207107]: Invalid user ubuntu from 80.94.92.171 port 35696
Jan 26 15:13:16 compute-1 sshd-session[207107]: Connection closed by invalid user ubuntu 80.94.92.171 port 35696 [preauth]
Jan 26 15:13:17 compute-1 sshd-session[207105]: Disconnecting invalid user alireza 185.246.128.170 port 59493: Change of username or service not allowed: (alireza,ssh-connection) -> (testuser,ssh-connection) [preauth]
Jan 26 15:13:18 compute-1 nova_compute[183403]: 2026-01-26 15:13:18.270 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:18 compute-1 nova_compute[183403]: 2026-01-26 15:13:18.369 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:19 compute-1 openstack_network_exporter[195610]: ERROR   15:13:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:13:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:13:19 compute-1 openstack_network_exporter[195610]: ERROR   15:13:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:13:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:13:19 compute-1 nova_compute[183403]: 2026-01-26 15:13:19.544 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:23 compute-1 nova_compute[183403]: 2026-01-26 15:13:23.271 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:23 compute-1 podman[207111]: 2026-01-26 15:13:23.889120452 +0000 UTC m=+0.062940833 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.buildah.version=1.33.7, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:13:23 compute-1 podman[207110]: 2026-01-26 15:13:23.889264226 +0000 UTC m=+0.062405402 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:13:24 compute-1 nova_compute[183403]: 2026-01-26 15:13:24.546 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:28 compute-1 nova_compute[183403]: 2026-01-26 15:13:28.274 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:29.039 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:13:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:29.040 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:13:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:29.040 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:13:29 compute-1 nova_compute[183403]: 2026-01-26 15:13:29.549 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:32 compute-1 sshd-session[207109]: Invalid user testuser from 185.246.128.170 port 63547
Jan 26 15:13:32 compute-1 podman[207159]: 2026-01-26 15:13:32.883663385 +0000 UTC m=+0.060899839 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 15:13:32 compute-1 podman[207158]: 2026-01-26 15:13:32.902623976 +0000 UTC m=+0.085620358 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 15:13:33 compute-1 nova_compute[183403]: 2026-01-26 15:13:33.276 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:34 compute-1 nova_compute[183403]: 2026-01-26 15:13:34.550 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:35 compute-1 sshd-session[207109]: Disconnecting invalid user testuser 185.246.128.170 port 63547: Change of username or service not allowed: (testuser,ssh-connection) -> (system,ssh-connection) [preauth]
Jan 26 15:13:35 compute-1 podman[192725]: time="2026-01-26T15:13:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:13:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:13:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:13:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:13:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2189 "" "Go-http-client/1.1"
Jan 26 15:13:35 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 15:13:38 compute-1 nova_compute[183403]: 2026-01-26 15:13:38.278 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:39 compute-1 nova_compute[183403]: 2026-01-26 15:13:39.552 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:39 compute-1 sshd-session[207205]: Invalid user system from 185.246.128.170 port 34030
Jan 26 15:13:40 compute-1 sshd-session[207205]: Disconnecting invalid user system 185.246.128.170 port 34030: Change of username or service not allowed: (system,ssh-connection) -> (ftpuser,ssh-connection) [preauth]
Jan 26 15:13:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:40.987 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:09:69 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1eff4cc862a94ba0b9bdd2e7cd089d3b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=078a9aa2-ec53-4a6e-9fce-171c59eb7b8d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f8cc5f9d-b902-4d9f-8c23-350733c66989) old=Port_Binding(mac=['fa:16:3e:bd:09:69'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1eff4cc862a94ba0b9bdd2e7cd089d3b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:13:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:40.988 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f8cc5f9d-b902-4d9f-8c23-350733c66989 in datapath 95b60e5a-ede2-4bc6-80f8-81c875c613ca updated
Jan 26 15:13:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:40.988 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 95b60e5a-ede2-4bc6-80f8-81c875c613ca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:13:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:40.989 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[013c5a65-1d58-48a6-80bf-8c24bded9d26]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:43 compute-1 nova_compute[183403]: 2026-01-26 15:13:43.281 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:44 compute-1 nova_compute[183403]: 2026-01-26 15:13:44.555 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:46 compute-1 sshd-session[207207]: Invalid user ftpuser from 185.246.128.170 port 28179
Jan 26 15:13:47 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:47.850 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:13:47 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:47.851 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:13:47 compute-1 nova_compute[183403]: 2026-01-26 15:13:47.852 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:48 compute-1 nova_compute[183403]: 2026-01-26 15:13:48.283 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:49 compute-1 openstack_network_exporter[195610]: ERROR   15:13:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:13:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:13:49 compute-1 openstack_network_exporter[195610]: ERROR   15:13:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:13:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:13:49 compute-1 nova_compute[183403]: 2026-01-26 15:13:49.557 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:51 compute-1 sshd-session[207207]: Disconnecting invalid user ftpuser 185.246.128.170 port 28179: Change of username or service not allowed: (ftpuser,ssh-connection) -> (zxcloudsetup,ssh-connection) [preauth]
Jan 26 15:13:53 compute-1 nova_compute[183403]: 2026-01-26 15:13:53.285 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:54 compute-1 nova_compute[183403]: 2026-01-26 15:13:54.559 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:54 compute-1 ovn_controller[95641]: 2026-01-26T15:13:54Z|00094|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 15:13:54 compute-1 podman[207211]: 2026-01-26 15:13:54.884701832 +0000 UTC m=+0.057865509 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:13:54 compute-1 podman[207212]: 2026-01-26 15:13:54.893972639 +0000 UTC m=+0.064610781 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 26 15:13:55 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:55.901 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:6a:99 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8cba9272-8ed8-4ace-9439-ba5a7a5f85b5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cba9272-8ed8-4ace-9439-ba5a7a5f85b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53b3f69d4de4466089ba08d308b821a1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e301ca5b-cc54-4075-ba42-31b675603f26, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e2cf2c65-d86e-406d-91b6-152fe2392ec0) old=Port_Binding(mac=['fa:16:3e:c7:6a:99'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8cba9272-8ed8-4ace-9439-ba5a7a5f85b5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cba9272-8ed8-4ace-9439-ba5a7a5f85b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53b3f69d4de4466089ba08d308b821a1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:13:55 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:55.902 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e2cf2c65-d86e-406d-91b6-152fe2392ec0 in datapath 8cba9272-8ed8-4ace-9439-ba5a7a5f85b5 updated
Jan 26 15:13:55 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:55.903 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8cba9272-8ed8-4ace-9439-ba5a7a5f85b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:13:55 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:55.904 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[abbe9709-ad5f-4aa0-b8aa-ce7be49a176b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:13:56 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:13:56.853 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:13:58 compute-1 nova_compute[183403]: 2026-01-26 15:13:58.286 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:13:59 compute-1 nova_compute[183403]: 2026-01-26 15:13:59.561 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:03 compute-1 nova_compute[183403]: 2026-01-26 15:14:03.288 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:03 compute-1 nova_compute[183403]: 2026-01-26 15:14:03.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:14:03 compute-1 podman[207254]: 2026-01-26 15:14:03.907802507 +0000 UTC m=+0.090839480 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 15:14:03 compute-1 podman[207255]: 2026-01-26 15:14:03.904482343 +0000 UTC m=+0.086128950 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest)
Jan 26 15:14:04 compute-1 nova_compute[183403]: 2026-01-26 15:14:04.566 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:04 compute-1 nova_compute[183403]: 2026-01-26 15:14:04.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:14:04 compute-1 sshd-session[207251]: Invalid user zxcloudsetup from 185.246.128.170 port 20596
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.125 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.126 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.126 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.127 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.270 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.271 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.290 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.291 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5852MB free_disk=73.14895629882812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.291 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:05 compute-1 nova_compute[183403]: 2026-01-26 15:14:05.292 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:05 compute-1 podman[192725]: time="2026-01-26T15:14:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:14:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:14:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:14:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:14:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2188 "" "Go-http-client/1.1"
Jan 26 15:14:06 compute-1 nova_compute[183403]: 2026-01-26 15:14:06.681 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:14:06 compute-1 nova_compute[183403]: 2026-01-26 15:14:06.681 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:14:05 up  1:09,  0 user,  load average: 0.17, 0.26, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:14:06 compute-1 nova_compute[183403]: 2026-01-26 15:14:06.704 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:14:07 compute-1 nova_compute[183403]: 2026-01-26 15:14:07.211 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:14:07 compute-1 sshd-session[207251]: Disconnecting invalid user zxcloudsetup 185.246.128.170 port 20596: Change of username or service not allowed: (zxcloudsetup,ssh-connection) -> (monitor,ssh-connection) [preauth]
Jan 26 15:14:07 compute-1 nova_compute[183403]: 2026-01-26 15:14:07.724 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:14:07 compute-1 nova_compute[183403]: 2026-01-26 15:14:07.725 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.433s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:08 compute-1 nova_compute[183403]: 2026-01-26 15:14:08.290 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:09 compute-1 nova_compute[183403]: 2026-01-26 15:14:09.565 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:10 compute-1 nova_compute[183403]: 2026-01-26 15:14:10.725 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:14:11 compute-1 nova_compute[183403]: 2026-01-26 15:14:11.237 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:14:11 compute-1 nova_compute[183403]: 2026-01-26 15:14:11.238 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:14:11 compute-1 nova_compute[183403]: 2026-01-26 15:14:11.238 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:14:11 compute-1 nova_compute[183403]: 2026-01-26 15:14:11.238 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:14:11 compute-1 nova_compute[183403]: 2026-01-26 15:14:11.239 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:14:11 compute-1 nova_compute[183403]: 2026-01-26 15:14:11.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:14:12 compute-1 nova_compute[183403]: 2026-01-26 15:14:12.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:14:13 compute-1 nova_compute[183403]: 2026-01-26 15:14:13.293 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:14 compute-1 nova_compute[183403]: 2026-01-26 15:14:14.601 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:18 compute-1 nova_compute[183403]: 2026-01-26 15:14:18.294 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:19 compute-1 openstack_network_exporter[195610]: ERROR   15:14:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:14:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:14:19 compute-1 openstack_network_exporter[195610]: ERROR   15:14:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:14:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:14:19 compute-1 nova_compute[183403]: 2026-01-26 15:14:19.604 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:23 compute-1 nova_compute[183403]: 2026-01-26 15:14:23.296 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:24 compute-1 sshd-session[207299]: Invalid user monitor from 185.246.128.170 port 57238
Jan 26 15:14:24 compute-1 nova_compute[183403]: 2026-01-26 15:14:24.607 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:25 compute-1 podman[207301]: 2026-01-26 15:14:25.889680874 +0000 UTC m=+0.061776842 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:14:25 compute-1 podman[207302]: 2026-01-26 15:14:25.895701275 +0000 UTC m=+0.066102163 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 15:14:28 compute-1 sshd-session[207299]: Disconnecting invalid user monitor 185.246.128.170 port 57238: Change of username or service not allowed: (monitor,ssh-connection) -> (engineer,ssh-connection) [preauth]
Jan 26 15:14:28 compute-1 nova_compute[183403]: 2026-01-26 15:14:28.298 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:29.041 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:29.041 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:29.041 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:29 compute-1 nova_compute[183403]: 2026-01-26 15:14:29.608 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:30 compute-1 sshd-session[207343]: Invalid user engineer from 185.246.128.170 port 40474
Jan 26 15:14:30 compute-1 sshd-session[207343]: Disconnecting invalid user engineer 185.246.128.170 port 40474: Change of username or service not allowed: (engineer,ssh-connection) -> (service,ssh-connection) [preauth]
Jan 26 15:14:33 compute-1 nova_compute[183403]: 2026-01-26 15:14:33.300 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:34 compute-1 nova_compute[183403]: 2026-01-26 15:14:34.610 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:34 compute-1 podman[207349]: 2026-01-26 15:14:34.884653345 +0000 UTC m=+0.053513077 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 26 15:14:34 compute-1 nova_compute[183403]: 2026-01-26 15:14:34.902 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:34 compute-1 nova_compute[183403]: 2026-01-26 15:14:34.903 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:34 compute-1 podman[207348]: 2026-01-26 15:14:34.93046555 +0000 UTC m=+0.102017576 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260120, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 15:14:35 compute-1 nova_compute[183403]: 2026-01-26 15:14:35.408 183407 DEBUG nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:14:35 compute-1 sshd-session[207346]: Invalid user service from 185.246.128.170 port 56777
Jan 26 15:14:35 compute-1 podman[192725]: time="2026-01-26T15:14:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:14:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:14:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:14:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:14:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2191 "" "Go-http-client/1.1"
Jan 26 15:14:35 compute-1 nova_compute[183403]: 2026-01-26 15:14:35.965 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:35 compute-1 nova_compute[183403]: 2026-01-26 15:14:35.965 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:35 compute-1 nova_compute[183403]: 2026-01-26 15:14:35.972 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:14:35 compute-1 nova_compute[183403]: 2026-01-26 15:14:35.973 183407 INFO nova.compute.claims [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:14:36 compute-1 sshd-session[207346]: Disconnecting invalid user service 185.246.128.170 port 56777: Change of username or service not allowed: (service,ssh-connection) -> (cq,ssh-connection) [preauth]
Jan 26 15:14:37 compute-1 nova_compute[183403]: 2026-01-26 15:14:37.038 183407 DEBUG nova.compute.provider_tree [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:14:37 compute-1 nova_compute[183403]: 2026-01-26 15:14:37.549 183407 DEBUG nova.scheduler.client.report [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:14:38 compute-1 nova_compute[183403]: 2026-01-26 15:14:38.060 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:38 compute-1 nova_compute[183403]: 2026-01-26 15:14:38.062 183407 DEBUG nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:14:38 compute-1 nova_compute[183403]: 2026-01-26 15:14:38.302 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:38 compute-1 nova_compute[183403]: 2026-01-26 15:14:38.584 183407 DEBUG nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:14:38 compute-1 nova_compute[183403]: 2026-01-26 15:14:38.585 183407 DEBUG nova.network.neutron [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:14:38 compute-1 nova_compute[183403]: 2026-01-26 15:14:38.585 183407 WARNING neutronclient.v2_0.client [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:14:38 compute-1 nova_compute[183403]: 2026-01-26 15:14:38.586 183407 WARNING neutronclient.v2_0.client [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:14:39 compute-1 nova_compute[183403]: 2026-01-26 15:14:39.092 183407 INFO nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:14:39 compute-1 nova_compute[183403]: 2026-01-26 15:14:39.343 183407 DEBUG nova.network.neutron [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Successfully created port: 1cb0a587-a627-48da-a9b1-ab7673e447ef _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:14:39 compute-1 nova_compute[183403]: 2026-01-26 15:14:39.601 183407 DEBUG nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:14:39 compute-1 nova_compute[183403]: 2026-01-26 15:14:39.611 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.648 183407 DEBUG nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.649 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.650 183407 INFO nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Creating image(s)
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.650 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Acquiring lock "/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.651 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.651 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.652 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.655 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.657 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.717 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.719 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.719 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.720 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.725 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.725 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.792 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.794 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.839 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.840 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.841 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.911 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.912 183407 DEBUG nova.virt.disk.api [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Checking if we can resize image /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.913 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.922 183407 DEBUG nova.network.neutron [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Successfully updated port: 1cb0a587-a627-48da-a9b1-ab7673e447ef _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.969 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.970 183407 DEBUG nova.virt.disk.api [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Cannot resize image /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.971 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.971 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Ensure instance console log exists: /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.972 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.972 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.973 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.977 183407 DEBUG nova.compute.manager [req-e0dde0ee-e356-4342-bf20-3d752f70fc09 req-37f31748-fe6e-4c1a-a072-ae126a4abe9b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-changed-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.977 183407 DEBUG nova.compute.manager [req-e0dde0ee-e356-4342-bf20-3d752f70fc09 req-37f31748-fe6e-4c1a-a072-ae126a4abe9b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Refreshing instance network info cache due to event network-changed-1cb0a587-a627-48da-a9b1-ab7673e447ef. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.978 183407 DEBUG oslo_concurrency.lockutils [req-e0dde0ee-e356-4342-bf20-3d752f70fc09 req-37f31748-fe6e-4c1a-a072-ae126a4abe9b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-a4414965-214d-4bdb-8c67-d0774c73ff66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.978 183407 DEBUG oslo_concurrency.lockutils [req-e0dde0ee-e356-4342-bf20-3d752f70fc09 req-37f31748-fe6e-4c1a-a072-ae126a4abe9b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-a4414965-214d-4bdb-8c67-d0774c73ff66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:14:40 compute-1 nova_compute[183403]: 2026-01-26 15:14:40.978 183407 DEBUG nova.network.neutron [req-e0dde0ee-e356-4342-bf20-3d752f70fc09 req-37f31748-fe6e-4c1a-a072-ae126a4abe9b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Refreshing network info cache for port 1cb0a587-a627-48da-a9b1-ab7673e447ef _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:14:41 compute-1 nova_compute[183403]: 2026-01-26 15:14:41.463 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Acquiring lock "refresh_cache-a4414965-214d-4bdb-8c67-d0774c73ff66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:14:41 compute-1 nova_compute[183403]: 2026-01-26 15:14:41.496 183407 WARNING neutronclient.v2_0.client [req-e0dde0ee-e356-4342-bf20-3d752f70fc09 req-37f31748-fe6e-4c1a-a072-ae126a4abe9b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:14:41 compute-1 nova_compute[183403]: 2026-01-26 15:14:41.632 183407 DEBUG nova.network.neutron [req-e0dde0ee-e356-4342-bf20-3d752f70fc09 req-37f31748-fe6e-4c1a-a072-ae126a4abe9b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:14:42 compute-1 nova_compute[183403]: 2026-01-26 15:14:42.289 183407 DEBUG nova.network.neutron [req-e0dde0ee-e356-4342-bf20-3d752f70fc09 req-37f31748-fe6e-4c1a-a072-ae126a4abe9b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:14:43 compute-1 nova_compute[183403]: 2026-01-26 15:14:43.060 183407 DEBUG oslo_concurrency.lockutils [req-e0dde0ee-e356-4342-bf20-3d752f70fc09 req-37f31748-fe6e-4c1a-a072-ae126a4abe9b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-a4414965-214d-4bdb-8c67-d0774c73ff66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:14:43 compute-1 nova_compute[183403]: 2026-01-26 15:14:43.061 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Acquired lock "refresh_cache-a4414965-214d-4bdb-8c67-d0774c73ff66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:14:43 compute-1 nova_compute[183403]: 2026-01-26 15:14:43.061 183407 DEBUG nova.network.neutron [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:14:43 compute-1 nova_compute[183403]: 2026-01-26 15:14:43.304 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:44 compute-1 nova_compute[183403]: 2026-01-26 15:14:44.156 183407 DEBUG nova.network.neutron [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:14:44 compute-1 nova_compute[183403]: 2026-01-26 15:14:44.392 183407 WARNING neutronclient.v2_0.client [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:14:44 compute-1 nova_compute[183403]: 2026-01-26 15:14:44.548 183407 DEBUG nova.network.neutron [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Updating instance_info_cache with network_info: [{"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:14:44 compute-1 nova_compute[183403]: 2026-01-26 15:14:44.613 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.181 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Releasing lock "refresh_cache-a4414965-214d-4bdb-8c67-d0774c73ff66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.182 183407 DEBUG nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Instance network_info: |[{"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.184 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Start _get_guest_xml network_info=[{"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.189 183407 WARNING nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.190 183407 DEBUG nova.virt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-919966140', uuid='a4414965-214d-4bdb-8c67-d0774c73ff66'), owner=OwnerMeta(userid='7936c169fc9442e1865811f8febd438d', username='tempest-TestExecuteBasicStrategy-1373478536-project-admin', projectid='53b3f69d4de4466089ba08d308b821a1', projectname='tempest-TestExecuteBasicStrategy-1373478536'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769440485.1908195) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.195 183407 DEBUG nova.virt.libvirt.host [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.195 183407 DEBUG nova.virt.libvirt.host [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.199 183407 DEBUG nova.virt.libvirt.host [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.200 183407 DEBUG nova.virt.libvirt.host [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.201 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.201 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.202 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.202 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.202 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.202 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.202 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.203 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.203 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.203 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.204 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.204 183407 DEBUG nova.virt.hardware [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.208 183407 DEBUG nova.virt.libvirt.vif [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:14:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-919966140',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-919966140',id=11,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='53b3f69d4de4466089ba08d308b821a1',ramdisk_id='',reservation_id='r-y70rxn4m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1373478536',owner_user_name='tempest-TestExecuteBasicStrategy-1373478536-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:14:39Z,user_data=None,user_id='7936c169fc9442e1865811f8febd438d',uuid=a4414965-214d-4bdb-8c67-d0774c73ff66,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.208 183407 DEBUG nova.network.os_vif_util [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Converting VIF {"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.209 183407 DEBUG nova.network.os_vif_util [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:90:44,bridge_name='br-int',has_traffic_filtering=True,id=1cb0a587-a627-48da-a9b1-ab7673e447ef,network=Network(95b60e5a-ede2-4bc6-80f8-81c875c613ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb0a587-a6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.210 183407 DEBUG nova.objects.instance [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid a4414965-214d-4bdb-8c67-d0774c73ff66 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.953 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <uuid>a4414965-214d-4bdb-8c67-d0774c73ff66</uuid>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <name>instance-0000000b</name>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteBasicStrategy-server-919966140</nova:name>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:14:45</nova:creationTime>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:14:45 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:14:45 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:user uuid="7936c169fc9442e1865811f8febd438d">tempest-TestExecuteBasicStrategy-1373478536-project-admin</nova:user>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:project uuid="53b3f69d4de4466089ba08d308b821a1">tempest-TestExecuteBasicStrategy-1373478536</nova:project>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         <nova:port uuid="1cb0a587-a627-48da-a9b1-ab7673e447ef">
Jan 26 15:14:45 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <system>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <entry name="serial">a4414965-214d-4bdb-8c67-d0774c73ff66</entry>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <entry name="uuid">a4414965-214d-4bdb-8c67-d0774c73ff66</entry>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     </system>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <os>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   </os>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <features>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   </features>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.config"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:e2:90:44"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <target dev="tap1cb0a587-a6"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/console.log" append="off"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <video>
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     </video>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:14:45 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:14:45 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:14:45 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:14:45 compute-1 nova_compute[183403]: </domain>
Jan 26 15:14:45 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.954 183407 DEBUG nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Preparing to wait for external event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.954 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.954 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.955 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.955 183407 DEBUG nova.virt.libvirt.vif [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:14:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-919966140',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-919966140',id=11,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='53b3f69d4de4466089ba08d308b821a1',ramdisk_id='',reservation_id='r-y70rxn4m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1373478536',owner_user_name='tempest-TestExecuteBasicStrategy-1373478536-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:14:39Z,user_data=None,user_id='7936c169fc9442e1865811f8febd438d',uuid=a4414965-214d-4bdb-8c67-d0774c73ff66,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.955 183407 DEBUG nova.network.os_vif_util [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Converting VIF {"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.956 183407 DEBUG nova.network.os_vif_util [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:90:44,bridge_name='br-int',has_traffic_filtering=True,id=1cb0a587-a627-48da-a9b1-ab7673e447ef,network=Network(95b60e5a-ede2-4bc6-80f8-81c875c613ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb0a587-a6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.956 183407 DEBUG os_vif [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:90:44,bridge_name='br-int',has_traffic_filtering=True,id=1cb0a587-a627-48da-a9b1-ab7673e447ef,network=Network(95b60e5a-ede2-4bc6-80f8-81c875c613ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb0a587-a6') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.957 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.958 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.959 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.960 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:45 compute-1 nova_compute[183403]: 2026-01-26 15:14:45.960 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'bf09513b-b80d-5d20-83ab-df22b08fd31f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.016 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.019 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.024 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.025 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1cb0a587-a6, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.025 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1cb0a587-a6, col_values=(('qos', UUID('c79c5b9b-6fea-4215-bf29-0dcdddfe214e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.025 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1cb0a587-a6, col_values=(('external_ids', {'iface-id': '1cb0a587-a627-48da-a9b1-ab7673e447ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:90:44', 'vm-uuid': 'a4414965-214d-4bdb-8c67-d0774c73ff66'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.026 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:46 compute-1 NetworkManager[55716]: <info>  [1769440486.0273] manager: (tap1cb0a587-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.029 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.033 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:46 compute-1 nova_compute[183403]: 2026-01-26 15:14:46.034 183407 INFO os_vif [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:90:44,bridge_name='br-int',has_traffic_filtering=True,id=1cb0a587-a627-48da-a9b1-ab7673e447ef,network=Network(95b60e5a-ede2-4bc6-80f8-81c875c613ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb0a587-a6')
Jan 26 15:14:47 compute-1 nova_compute[183403]: 2026-01-26 15:14:47.838 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:14:47 compute-1 nova_compute[183403]: 2026-01-26 15:14:47.839 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:14:47 compute-1 nova_compute[183403]: 2026-01-26 15:14:47.840 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] No VIF found with MAC fa:16:3e:e2:90:44, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:14:47 compute-1 nova_compute[183403]: 2026-01-26 15:14:47.840 183407 INFO nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Using config drive
Jan 26 15:14:48 compute-1 nova_compute[183403]: 2026-01-26 15:14:48.430 183407 WARNING neutronclient.v2_0.client [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:14:49 compute-1 openstack_network_exporter[195610]: ERROR   15:14:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:14:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:14:49 compute-1 openstack_network_exporter[195610]: ERROR   15:14:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:14:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:14:49 compute-1 nova_compute[183403]: 2026-01-26 15:14:49.615 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:51 compute-1 nova_compute[183403]: 2026-01-26 15:14:51.028 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:51 compute-1 nova_compute[183403]: 2026-01-26 15:14:51.789 183407 INFO nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Creating config drive at /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.config
Jan 26 15:14:51 compute-1 nova_compute[183403]: 2026-01-26 15:14:51.794 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp0abtk0ez execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:14:51 compute-1 nova_compute[183403]: 2026-01-26 15:14:51.925 183407 DEBUG oslo_concurrency.processutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp0abtk0ez" returned: 0 in 0.130s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:14:51 compute-1 kernel: tap1cb0a587-a6: entered promiscuous mode
Jan 26 15:14:51 compute-1 NetworkManager[55716]: <info>  [1769440491.9964] manager: (tap1cb0a587-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.000 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:52 compute-1 ovn_controller[95641]: 2026-01-26T15:14:51Z|00095|binding|INFO|Claiming lport 1cb0a587-a627-48da-a9b1-ab7673e447ef for this chassis.
Jan 26 15:14:52 compute-1 ovn_controller[95641]: 2026-01-26T15:14:51Z|00096|binding|INFO|1cb0a587-a627-48da-a9b1-ab7673e447ef: Claiming fa:16:3e:e2:90:44 10.100.0.5
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.005 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:52 compute-1 systemd-udevd[207428]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.035 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:90:44 10.100.0.5'], port_security=['fa:16:3e:e2:90:44 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a4414965-214d-4bdb-8c67-d0774c73ff66', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53b3f69d4de4466089ba08d308b821a1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4f9e494b-2632-4b9d-8b3a-97aadbce4827', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=078a9aa2-ec53-4a6e-9fce-171c59eb7b8d, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=1cb0a587-a627-48da-a9b1-ab7673e447ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.037 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 1cb0a587-a627-48da-a9b1-ab7673e447ef in datapath 95b60e5a-ede2-4bc6-80f8-81c875c613ca bound to our chassis
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.038 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 95b60e5a-ede2-4bc6-80f8-81c875c613ca
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.048 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[220286c2-b1f6-417f-8983-55af76a04ffe]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.049 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap95b60e5a-e1 in ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:14:52 compute-1 NetworkManager[55716]: <info>  [1769440492.0506] device (tap1cb0a587-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.051 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap95b60e5a-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.051 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[03090e46-0af2-4bd9-bf38-a0d5a703aa46]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.051 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8892e46b-e3c4-402d-b857-53d2b7f40082]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 NetworkManager[55716]: <info>  [1769440492.0526] device (tap1cb0a587-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:14:52 compute-1 systemd-machined[154697]: New machine qemu-8-instance-0000000b.
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.055 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:52 compute-1 ovn_controller[95641]: 2026-01-26T15:14:52Z|00097|binding|INFO|Setting lport 1cb0a587-a627-48da-a9b1-ab7673e447ef ovn-installed in OVS
Jan 26 15:14:52 compute-1 ovn_controller[95641]: 2026-01-26T15:14:52Z|00098|binding|INFO|Setting lport 1cb0a587-a627-48da-a9b1-ab7673e447ef up in Southbound
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.059 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:52 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-0000000b.
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.066 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[24bd44d8-8ffa-4ca2-92e0-23e215d8b3f2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.082 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4fde4866-4ca4-42bb-816b-7cdcaa97082e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.115 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[fa011e5f-a213-4a25-9b47-7d806ce58fec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.121 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[030bc214-87c7-40ef-a156-7037a1638976]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 NetworkManager[55716]: <info>  [1769440492.1224] manager: (tap95b60e5a-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.156 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0be843-19de-442a-8fb2-7acd1b5f9c86]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.159 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[2f664485-dde6-49fc-bba6-8914e41764b0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 NetworkManager[55716]: <info>  [1769440492.1901] device (tap95b60e5a-e0): carrier: link connected
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.194 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[67600cdf-f7a7-4556-b42e-f330009fef85]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.217 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6db0c075-c81b-48af-860e-1548ece46925]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap95b60e5a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:09:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421479, 'reachable_time': 15271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207462, 'error': None, 'target': 'ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.233 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[834b0462-9990-435b-9290-aa45ae72dbc4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:969'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 421479, 'tstamp': 421479}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207463, 'error': None, 'target': 'ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.253 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6021aad7-df6b-4bdb-bf73-49a0d8c4b0e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap95b60e5a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:09:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421479, 'reachable_time': 15271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 207464, 'error': None, 'target': 'ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.286 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c84dedf3-f6f1-4c4a-8b0d-a291c94f0fc3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.297 183407 DEBUG nova.compute.manager [req-ad2da7a9-e67a-4dd2-961f-13ed47f8a6bc req-5ff11e87-b27f-4ef6-aaca-fad5cd931222 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.297 183407 DEBUG oslo_concurrency.lockutils [req-ad2da7a9-e67a-4dd2-961f-13ed47f8a6bc req-5ff11e87-b27f-4ef6-aaca-fad5cd931222 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.298 183407 DEBUG oslo_concurrency.lockutils [req-ad2da7a9-e67a-4dd2-961f-13ed47f8a6bc req-5ff11e87-b27f-4ef6-aaca-fad5cd931222 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.298 183407 DEBUG oslo_concurrency.lockutils [req-ad2da7a9-e67a-4dd2-961f-13ed47f8a6bc req-5ff11e87-b27f-4ef6-aaca-fad5cd931222 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.298 183407 DEBUG nova.compute.manager [req-ad2da7a9-e67a-4dd2-961f-13ed47f8a6bc req-5ff11e87-b27f-4ef6-aaca-fad5cd931222 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Processing event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.365 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ebabb826-5b7d-4f4b-9aa6-ba9900887ab7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.367 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95b60e5a-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.367 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.367 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap95b60e5a-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.369 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:52 compute-1 kernel: tap95b60e5a-e0: entered promiscuous mode
Jan 26 15:14:52 compute-1 NetworkManager[55716]: <info>  [1769440492.3705] manager: (tap95b60e5a-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.373 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap95b60e5a-e0, col_values=(('external_ids', {'iface-id': 'f8cc5f9d-b902-4d9f-8c23-350733c66989'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.374 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:52 compute-1 ovn_controller[95641]: 2026-01-26T15:14:52Z|00099|binding|INFO|Releasing lport f8cc5f9d-b902-4d9f-8c23-350733c66989 from this chassis (sb_readonly=0)
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.374 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.376 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[b4728e3e-81a8-442c-8807-a2f11cce7db9]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.377 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.377 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.377 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 95b60e5a-ede2-4bc6-80f8-81c875c613ca disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.377 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.378 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8fdc94aa-deb7-47a6-9985-a01b1a711fc5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.378 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.378 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[905bc846-b6db-407b-bf78-4f6288ab44dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.379 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-95b60e5a-ede2-4bc6-80f8-81c875c613ca
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID 95b60e5a-ede2-4bc6-80f8-81c875c613ca
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.379 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'env', 'PROCESS_TAG=haproxy-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/95b60e5a-ede2-4bc6-80f8-81c875c613ca.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.396 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.788 183407 DEBUG nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.794 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.797 183407 INFO nova.virt.libvirt.driver [-] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Instance spawned successfully.
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.798 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:14:52 compute-1 podman[207502]: 2026-01-26 15:14:52.800731993 +0000 UTC m=+0.062208611 container create b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Jan 26 15:14:52 compute-1 systemd[1]: Started libpod-conmon-b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b.scope.
Jan 26 15:14:52 compute-1 podman[207502]: 2026-01-26 15:14:52.760460688 +0000 UTC m=+0.021937286 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:14:52 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:14:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e453138fdea51e8c720e039034c340eb73742d8e963831191fad92a481fdf06/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:14:52 compute-1 podman[207502]: 2026-01-26 15:14:52.90451884 +0000 UTC m=+0.165995508 container init b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 15:14:52 compute-1 podman[207502]: 2026-01-26 15:14:52.91091239 +0000 UTC m=+0.172389008 container start b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120)
Jan 26 15:14:52 compute-1 neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca[207517]: [NOTICE]   (207521) : New worker (207523) forked
Jan 26 15:14:52 compute-1 neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca[207517]: [NOTICE]   (207521) : Loading success.
Jan 26 15:14:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:52.977 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:14:52 compute-1 nova_compute[183403]: 2026-01-26 15:14:52.978 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:53.001 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:14:53 compute-1 nova_compute[183403]: 2026-01-26 15:14:53.350 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:14:53 compute-1 nova_compute[183403]: 2026-01-26 15:14:53.351 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:14:53 compute-1 nova_compute[183403]: 2026-01-26 15:14:53.352 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:14:53 compute-1 nova_compute[183403]: 2026-01-26 15:14:53.352 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:14:53 compute-1 nova_compute[183403]: 2026-01-26 15:14:53.353 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:14:53 compute-1 nova_compute[183403]: 2026-01-26 15:14:53.353 183407 DEBUG nova.virt.libvirt.driver [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.355 183407 DEBUG nova.compute.manager [req-5fc3250d-1058-495f-87cc-a72c2daca487 req-8cd58a16-8e3b-44ff-876b-9cb5b6a16d86 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.355 183407 DEBUG oslo_concurrency.lockutils [req-5fc3250d-1058-495f-87cc-a72c2daca487 req-8cd58a16-8e3b-44ff-876b-9cb5b6a16d86 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.355 183407 DEBUG oslo_concurrency.lockutils [req-5fc3250d-1058-495f-87cc-a72c2daca487 req-8cd58a16-8e3b-44ff-876b-9cb5b6a16d86 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.356 183407 DEBUG oslo_concurrency.lockutils [req-5fc3250d-1058-495f-87cc-a72c2daca487 req-8cd58a16-8e3b-44ff-876b-9cb5b6a16d86 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.356 183407 DEBUG nova.compute.manager [req-5fc3250d-1058-495f-87cc-a72c2daca487 req-8cd58a16-8e3b-44ff-876b-9cb5b6a16d86 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] No waiting events found dispatching network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.356 183407 WARNING nova.compute.manager [req-5fc3250d-1058-495f-87cc-a72c2daca487 req-8cd58a16-8e3b-44ff-876b-9cb5b6a16d86 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received unexpected event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef for instance with vm_state building and task_state spawning.
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.368 183407 INFO nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Took 13.72 seconds to spawn the instance on the hypervisor.
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.369 183407 DEBUG nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.616 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:54 compute-1 nova_compute[183403]: 2026-01-26 15:14:54.963 183407 INFO nova.compute.manager [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Took 19.05 seconds to build instance.
Jan 26 15:14:55 compute-1 nova_compute[183403]: 2026-01-26 15:14:55.472 183407 DEBUG oslo_concurrency.lockutils [None req-f68dd6ce-1134-49e8-ab24-f20595516e41 7936c169fc9442e1865811f8febd438d 53b3f69d4de4466089ba08d308b821a1 - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.569s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:14:56 compute-1 nova_compute[183403]: 2026-01-26 15:14:56.085 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:14:56 compute-1 sshd-session[207406]: Invalid user cq from 185.246.128.170 port 57028
Jan 26 15:14:56 compute-1 podman[207534]: 2026-01-26 15:14:56.570300146 +0000 UTC m=+0.057741207 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, config_id=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Jan 26 15:14:56 compute-1 podman[207533]: 2026-01-26 15:14:56.603147717 +0000 UTC m=+0.085153741 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:14:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:14:58.002 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:14:58 compute-1 sshd-session[207406]: Disconnecting invalid user cq 185.246.128.170 port 57028: Change of username or service not allowed: (cq,ssh-connection) -> (sftp,ssh-connection) [preauth]
Jan 26 15:14:59 compute-1 nova_compute[183403]: 2026-01-26 15:14:59.619 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:01 compute-1 nova_compute[183403]: 2026-01-26 15:15:01.088 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:03 compute-1 nova_compute[183403]: 2026-01-26 15:15:03.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:15:04 compute-1 ovn_controller[95641]: 2026-01-26T15:15:04Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e2:90:44 10.100.0.5
Jan 26 15:15:04 compute-1 ovn_controller[95641]: 2026-01-26T15:15:04Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e2:90:44 10.100.0.5
Jan 26 15:15:04 compute-1 nova_compute[183403]: 2026-01-26 15:15:04.620 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:05 compute-1 nova_compute[183403]: 2026-01-26 15:15:05.584 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:15:05 compute-1 podman[192725]: time="2026-01-26T15:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:15:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:15:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2649 "" "Go-http-client/1.1"
Jan 26 15:15:05 compute-1 podman[207601]: 2026-01-26 15:15:05.9517635 +0000 UTC m=+0.107776892 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 15:15:05 compute-1 podman[207600]: 2026-01-26 15:15:05.998931557 +0000 UTC m=+0.160834587 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 15:15:06 compute-1 nova_compute[183403]: 2026-01-26 15:15:06.090 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:06 compute-1 nova_compute[183403]: 2026-01-26 15:15:06.107 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:06 compute-1 nova_compute[183403]: 2026-01-26 15:15:06.108 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:06 compute-1 nova_compute[183403]: 2026-01-26 15:15:06.108 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:06 compute-1 nova_compute[183403]: 2026-01-26 15:15:06.108 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.175 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.260 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.262 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.323 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.509 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.513 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.550 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.552 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5669MB free_disk=73.11923217773438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.552 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:07 compute-1 nova_compute[183403]: 2026-01-26 15:15:07.553 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:08 compute-1 sshd-session[207581]: Invalid user sftp from 185.246.128.170 port 58743
Jan 26 15:15:08 compute-1 nova_compute[183403]: 2026-01-26 15:15:08.624 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance a4414965-214d-4bdb-8c67-d0774c73ff66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:15:08 compute-1 nova_compute[183403]: 2026-01-26 15:15:08.625 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:15:08 compute-1 nova_compute[183403]: 2026-01-26 15:15:08.625 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:15:07 up  1:10,  0 user,  load average: 0.28, 0.26, 0.35\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_53b3f69d4de4466089ba08d308b821a1': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:15:08 compute-1 sshd-session[207581]: Disconnecting invalid user sftp 185.246.128.170 port 58743: Change of username or service not allowed: (sftp,ssh-connection) -> (mohamed,ssh-connection) [preauth]
Jan 26 15:15:08 compute-1 nova_compute[183403]: 2026-01-26 15:15:08.668 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:15:09 compute-1 nova_compute[183403]: 2026-01-26 15:15:09.178 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:15:09 compute-1 nova_compute[183403]: 2026-01-26 15:15:09.624 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:09 compute-1 nova_compute[183403]: 2026-01-26 15:15:09.690 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:15:09 compute-1 nova_compute[183403]: 2026-01-26 15:15:09.691 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.138s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:11 compute-1 nova_compute[183403]: 2026-01-26 15:15:11.138 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:11 compute-1 nova_compute[183403]: 2026-01-26 15:15:11.684 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:15:11 compute-1 nova_compute[183403]: 2026-01-26 15:15:11.684 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:15:11 compute-1 nova_compute[183403]: 2026-01-26 15:15:11.684 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:15:11 compute-1 nova_compute[183403]: 2026-01-26 15:15:11.685 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:15:11 compute-1 nova_compute[183403]: 2026-01-26 15:15:11.685 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:15:12 compute-1 nova_compute[183403]: 2026-01-26 15:15:12.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:15:13 compute-1 sshd-session[207652]: Invalid user mohamed from 185.246.128.170 port 21086
Jan 26 15:15:13 compute-1 nova_compute[183403]: 2026-01-26 15:15:13.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:15:13 compute-1 sshd-session[207652]: Disconnecting invalid user mohamed 185.246.128.170 port 21086: Change of username or service not allowed: (mohamed,ssh-connection) -> (ftp,ssh-connection) [preauth]
Jan 26 15:15:14 compute-1 nova_compute[183403]: 2026-01-26 15:15:14.626 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:16 compute-1 nova_compute[183403]: 2026-01-26 15:15:16.141 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:17 compute-1 nova_compute[183403]: 2026-01-26 15:15:17.608 183407 DEBUG nova.compute.manager [None req-e6f74d58-bfb5-4027-9288-4c2dba7a7993 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Jan 26 15:15:17 compute-1 nova_compute[183403]: 2026-01-26 15:15:17.668 183407 DEBUG nova.compute.provider_tree [None req-e6f74d58-bfb5-4027-9288-4c2dba7a7993 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 10 to 12 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 15:15:19 compute-1 openstack_network_exporter[195610]: ERROR   15:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:15:19 compute-1 openstack_network_exporter[195610]: ERROR   15:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:15:19 compute-1 nova_compute[183403]: 2026-01-26 15:15:19.629 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:21 compute-1 nova_compute[183403]: 2026-01-26 15:15:21.143 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:21 compute-1 sshd-session[207654]: Disconnecting authenticating user ftp 185.246.128.170 port 44995: Change of username or service not allowed: (ftp,ssh-connection) -> (vhserver,ssh-connection) [preauth]
Jan 26 15:15:22 compute-1 ovn_controller[95641]: 2026-01-26T15:15:22Z|00100|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 15:15:24 compute-1 nova_compute[183403]: 2026-01-26 15:15:24.632 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:26 compute-1 nova_compute[183403]: 2026-01-26 15:15:26.146 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:26 compute-1 podman[207658]: 2026-01-26 15:15:26.8773083 +0000 UTC m=+0.050553091 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:15:26 compute-1 podman[207659]: 2026-01-26 15:15:26.887434152 +0000 UTC m=+0.058410117 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:15:27 compute-1 nova_compute[183403]: 2026-01-26 15:15:27.281 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Check if temp file /var/lib/nova/instances/tmpoey4ybrf exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 15:15:27 compute-1 nova_compute[183403]: 2026-01-26 15:15:27.286 183407 DEBUG nova.compute.manager [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoey4ybrf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a4414965-214d-4bdb-8c67-d0774c73ff66',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 15:15:28 compute-1 sshd-session[207656]: Invalid user vhserver from 185.246.128.170 port 25779
Jan 26 15:15:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:29.042 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:29.043 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:29.043 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:29 compute-1 nova_compute[183403]: 2026-01-26 15:15:29.634 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:31 compute-1 nova_compute[183403]: 2026-01-26 15:15:31.505 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:33 compute-1 nova_compute[183403]: 2026-01-26 15:15:33.552 183407 DEBUG oslo_concurrency.processutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:15:33 compute-1 nova_compute[183403]: 2026-01-26 15:15:33.630 183407 DEBUG oslo_concurrency.processutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:15:33 compute-1 nova_compute[183403]: 2026-01-26 15:15:33.631 183407 DEBUG oslo_concurrency.processutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:15:33 compute-1 nova_compute[183403]: 2026-01-26 15:15:33.689 183407 DEBUG oslo_concurrency.processutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:15:33 compute-1 nova_compute[183403]: 2026-01-26 15:15:33.690 183407 DEBUG nova.compute.manager [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Preparing to wait for external event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:15:33 compute-1 nova_compute[183403]: 2026-01-26 15:15:33.690 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:33 compute-1 nova_compute[183403]: 2026-01-26 15:15:33.691 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:33 compute-1 nova_compute[183403]: 2026-01-26 15:15:33.691 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:33 compute-1 sshd-session[207656]: Disconnecting invalid user vhserver 185.246.128.170 port 25779: Change of username or service not allowed: (vhserver,ssh-connection) -> (alan,ssh-connection) [preauth]
Jan 26 15:15:34 compute-1 nova_compute[183403]: 2026-01-26 15:15:34.637 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:35 compute-1 podman[192725]: time="2026-01-26T15:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:15:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:15:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2656 "" "Go-http-client/1.1"
Jan 26 15:15:36 compute-1 nova_compute[183403]: 2026-01-26 15:15:36.509 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:36 compute-1 podman[207709]: 2026-01-26 15:15:36.890263583 +0000 UTC m=+0.067567823 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 15:15:36 compute-1 podman[207708]: 2026-01-26 15:15:36.933150753 +0000 UTC m=+0.109130521 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, 
org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 15:15:39 compute-1 nova_compute[183403]: 2026-01-26 15:15:39.641 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:40.700 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:15:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:40.701 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:15:40 compute-1 nova_compute[183403]: 2026-01-26 15:15:40.701 183407 DEBUG nova.compute.manager [req-c95c070f-ca37-4f67-a415-a39a49be306c req-fe78aab7-4c9e-4912-8214-1b1c123e9b08 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:15:40 compute-1 nova_compute[183403]: 2026-01-26 15:15:40.702 183407 DEBUG oslo_concurrency.lockutils [req-c95c070f-ca37-4f67-a415-a39a49be306c req-fe78aab7-4c9e-4912-8214-1b1c123e9b08 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:40 compute-1 nova_compute[183403]: 2026-01-26 15:15:40.702 183407 DEBUG oslo_concurrency.lockutils [req-c95c070f-ca37-4f67-a415-a39a49be306c req-fe78aab7-4c9e-4912-8214-1b1c123e9b08 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:40.702 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:15:40 compute-1 nova_compute[183403]: 2026-01-26 15:15:40.702 183407 DEBUG oslo_concurrency.lockutils [req-c95c070f-ca37-4f67-a415-a39a49be306c req-fe78aab7-4c9e-4912-8214-1b1c123e9b08 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:40 compute-1 nova_compute[183403]: 2026-01-26 15:15:40.703 183407 DEBUG nova.compute.manager [req-c95c070f-ca37-4f67-a415-a39a49be306c req-fe78aab7-4c9e-4912-8214-1b1c123e9b08 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] No event matching network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef in dict_keys([('network-vif-plugged', '1cb0a587-a627-48da-a9b1-ab7673e447ef')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 15:15:40 compute-1 nova_compute[183403]: 2026-01-26 15:15:40.703 183407 DEBUG nova.compute.manager [req-c95c070f-ca37-4f67-a415-a39a49be306c req-fe78aab7-4c9e-4912-8214-1b1c123e9b08 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:15:40 compute-1 nova_compute[183403]: 2026-01-26 15:15:40.704 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:41 compute-1 nova_compute[183403]: 2026-01-26 15:15:41.512 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:41 compute-1 sshd-session[207753]: Invalid user alan from 185.246.128.170 port 64156
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.214 183407 INFO nova.compute.manager [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Took 8.52 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.860 183407 DEBUG nova.compute.manager [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.861 183407 DEBUG oslo_concurrency.lockutils [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.861 183407 DEBUG oslo_concurrency.lockutils [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.861 183407 DEBUG oslo_concurrency.lockutils [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.861 183407 DEBUG nova.compute.manager [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Processing event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.862 183407 DEBUG nova.compute.manager [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-changed-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.862 183407 DEBUG nova.compute.manager [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Refreshing instance network info cache due to event network-changed-1cb0a587-a627-48da-a9b1-ab7673e447ef. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.862 183407 DEBUG oslo_concurrency.lockutils [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-a4414965-214d-4bdb-8c67-d0774c73ff66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.862 183407 DEBUG oslo_concurrency.lockutils [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-a4414965-214d-4bdb-8c67-d0774c73ff66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.863 183407 DEBUG nova.network.neutron [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Refreshing network info cache for port 1cb0a587-a627-48da-a9b1-ab7673e447ef _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:15:42 compute-1 nova_compute[183403]: 2026-01-26 15:15:42.864 183407 DEBUG nova.compute.manager [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:15:43 compute-1 nova_compute[183403]: 2026-01-26 15:15:43.369 183407 WARNING neutronclient.v2_0.client [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:15:43 compute-1 nova_compute[183403]: 2026-01-26 15:15:43.376 183407 DEBUG nova.compute.manager [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoey4ybrf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a4414965-214d-4bdb-8c67-d0774c73ff66',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(64bb08fc-292f-4849-952c-8f6eac3e7fcc),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 15:15:43 compute-1 sshd-session[207753]: Disconnecting invalid user alan 185.246.128.170 port 64156: Change of username or service not allowed: (alan,ssh-connection) -> (theta,ssh-connection) [preauth]
Jan 26 15:15:43 compute-1 nova_compute[183403]: 2026-01-26 15:15:43.909 183407 DEBUG nova.objects.instance [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid a4414965-214d-4bdb-8c67-d0774c73ff66 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:15:43 compute-1 nova_compute[183403]: 2026-01-26 15:15:43.910 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 15:15:43 compute-1 nova_compute[183403]: 2026-01-26 15:15:43.911 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 15:15:43 compute-1 nova_compute[183403]: 2026-01-26 15:15:43.911 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.414 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.414 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.429 183407 DEBUG nova.virt.libvirt.vif [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:14:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-919966140',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-919966140',id=11,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:14:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='53b3f69d4de4466089ba08d308b821a1',ramdisk_id='',reservation_id='r-y70rxn4m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1373478536',owner_user_name='tempest-TestExecuteBasicStrategy-1373478536-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:14:54Z,user_data=None,user_id='7936c169fc9442e1865811f8febd438d',uuid=a4414965-214d-4bdb-8c67-d0774c73ff66,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.429 183407 DEBUG nova.network.os_vif_util [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.430 183407 DEBUG nova.network.os_vif_util [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:90:44,bridge_name='br-int',has_traffic_filtering=True,id=1cb0a587-a627-48da-a9b1-ab7673e447ef,network=Network(95b60e5a-ede2-4bc6-80f8-81c875c613ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb0a587-a6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.430 183407 DEBUG nova.virt.libvirt.migration [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <mac address="fa:16:3e:e2:90:44"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <model type="virtio"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <mtu size="1442"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <target dev="tap1cb0a587-a6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]: </interface>
Jan 26 15:15:44 compute-1 nova_compute[183403]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.431 183407 DEBUG nova.virt.libvirt.migration [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <name>instance-0000000b</name>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <uuid>a4414965-214d-4bdb-8c67-d0774c73ff66</uuid>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteBasicStrategy-server-919966140</nova:name>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:14:45</nova:creationTime>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:15:44 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:15:44 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:user uuid="7936c169fc9442e1865811f8febd438d">tempest-TestExecuteBasicStrategy-1373478536-project-admin</nova:user>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:project uuid="53b3f69d4de4466089ba08d308b821a1">tempest-TestExecuteBasicStrategy-1373478536</nova:project>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:port uuid="1cb0a587-a627-48da-a9b1-ab7673e447ef">
Jan 26 15:15:44 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <system>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="serial">a4414965-214d-4bdb-8c67-d0774c73ff66</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="uuid">a4414965-214d-4bdb-8c67-d0774c73ff66</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </system>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <os>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </os>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <features>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </features>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.config"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <interface type="ethernet"><mac address="fa:16:3e:e2:90:44"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1cb0a587-a6"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </interface><serial type="pty">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/console.log" append="off"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </target>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/console.log" append="off"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </console>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </input>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <video>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </video>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]: </domain>
Jan 26 15:15:44 compute-1 nova_compute[183403]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.432 183407 DEBUG nova.virt.libvirt.migration [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <name>instance-0000000b</name>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <uuid>a4414965-214d-4bdb-8c67-d0774c73ff66</uuid>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteBasicStrategy-server-919966140</nova:name>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:14:45</nova:creationTime>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:15:44 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:15:44 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:user uuid="7936c169fc9442e1865811f8febd438d">tempest-TestExecuteBasicStrategy-1373478536-project-admin</nova:user>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:project uuid="53b3f69d4de4466089ba08d308b821a1">tempest-TestExecuteBasicStrategy-1373478536</nova:project>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:port uuid="1cb0a587-a627-48da-a9b1-ab7673e447ef">
Jan 26 15:15:44 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <system>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="serial">a4414965-214d-4bdb-8c67-d0774c73ff66</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="uuid">a4414965-214d-4bdb-8c67-d0774c73ff66</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </system>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <os>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </os>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <features>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </features>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.config"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:e2:90:44"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target dev="tap1cb0a587-a6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/console.log" append="off"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </target>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/console.log" append="off"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </console>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </input>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <video>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </video>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]: </domain>
Jan 26 15:15:44 compute-1 nova_compute[183403]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.433 183407 DEBUG nova.virt.libvirt.migration [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <name>instance-0000000b</name>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <uuid>a4414965-214d-4bdb-8c67-d0774c73ff66</uuid>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteBasicStrategy-server-919966140</nova:name>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:14:45</nova:creationTime>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:15:44 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:15:44 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:user uuid="7936c169fc9442e1865811f8febd438d">tempest-TestExecuteBasicStrategy-1373478536-project-admin</nova:user>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:project uuid="53b3f69d4de4466089ba08d308b821a1">tempest-TestExecuteBasicStrategy-1373478536</nova:project>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <nova:port uuid="1cb0a587-a627-48da-a9b1-ab7673e447ef">
Jan 26 15:15:44 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <system>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="serial">a4414965-214d-4bdb-8c67-d0774c73ff66</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="uuid">a4414965-214d-4bdb-8c67-d0774c73ff66</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </system>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <os>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </os>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <features>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </features>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/disk.config"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:e2:90:44"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target dev="tap1cb0a587-a6"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/console.log" append="off"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:15:44 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       </target>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66/console.log" append="off"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </console>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </input>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <video>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </video>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:15:44 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:15:44 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:15:44 compute-1 nova_compute[183403]: </domain>
Jan 26 15:15:44 compute-1 nova_compute[183403]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.433 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.643 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.917 183407 DEBUG nova.virt.libvirt.migration [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 15:15:44 compute-1 nova_compute[183403]: 2026-01-26 15:15:44.918 183407 INFO nova.virt.libvirt.migration [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 15:15:45 compute-1 nova_compute[183403]: 2026-01-26 15:15:45.258 183407 WARNING neutronclient.v2_0.client [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:15:45 compute-1 nova_compute[183403]: 2026-01-26 15:15:45.477 183407 DEBUG nova.network.neutron [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Updated VIF entry in instance network info cache for port 1cb0a587-a627-48da-a9b1-ab7673e447ef. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 15:15:45 compute-1 nova_compute[183403]: 2026-01-26 15:15:45.477 183407 DEBUG nova.network.neutron [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Updating instance_info_cache with network_info: [{"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.022 183407 DEBUG oslo_concurrency.lockutils [req-bcfa1162-f700-4a97-86f3-267e6818ae0b req-b6a47b63-15f5-4ae1-8ab9-7ef8da8baba7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-a4414965-214d-4bdb-8c67-d0774c73ff66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:15:46 compute-1 kernel: tap1cb0a587-a6 (unregistering): left promiscuous mode
Jan 26 15:15:46 compute-1 NetworkManager[55716]: <info>  [1769440546.1861] device (tap1cb0a587-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:15:46 compute-1 ovn_controller[95641]: 2026-01-26T15:15:46Z|00101|binding|INFO|Releasing lport 1cb0a587-a627-48da-a9b1-ab7673e447ef from this chassis (sb_readonly=0)
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.194 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:46 compute-1 ovn_controller[95641]: 2026-01-26T15:15:46Z|00102|binding|INFO|Setting lport 1cb0a587-a627-48da-a9b1-ab7673e447ef down in Southbound
Jan 26 15:15:46 compute-1 ovn_controller[95641]: 2026-01-26T15:15:46Z|00103|binding|INFO|Removing iface tap1cb0a587-a6 ovn-installed in OVS
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.196 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.210 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:46 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 26 15:15:46 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Consumed 14.667s CPU time.
Jan 26 15:15:46 compute-1 systemd-machined[154697]: Machine qemu-8-instance-0000000b terminated.
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.404 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:90:44 10.100.0.5'], port_security=['fa:16:3e:e2:90:44 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3e0272b2-d627-4653-a221-12286e3af322'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a4414965-214d-4bdb-8c67-d0774c73ff66', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53b3f69d4de4466089ba08d308b821a1', 'neutron:revision_number': '10', 'neutron:security_group_ids': '4f9e494b-2632-4b9d-8b3a-97aadbce4827', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=078a9aa2-ec53-4a6e-9fce-171c59eb7b8d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=1cb0a587-a627-48da-a9b1-ab7673e447ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:15:46 compute-1 ovn_controller[95641]: 2026-01-26T15:15:46Z|00104|binding|INFO|Releasing lport f8cc5f9d-b902-4d9f-8c23-350733c66989 from this chassis (sb_readonly=0)
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.404 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 1cb0a587-a627-48da-a9b1-ab7673e447ef in datapath 95b60e5a-ede2-4bc6-80f8-81c875c613ca unbound from our chassis
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.406 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 95b60e5a-ede2-4bc6-80f8-81c875c613ca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.407 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a98ac5fc-164e-462f-9f1a-811e0accb914]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.408 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca namespace which is not needed anymore
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.446 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.446 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.446 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.449 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.513 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:46 compute-1 neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca[207517]: [NOTICE]   (207521) : haproxy version is 3.0.5-8e879a5
Jan 26 15:15:46 compute-1 neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca[207517]: [NOTICE]   (207521) : path to executable is /usr/sbin/haproxy
Jan 26 15:15:46 compute-1 neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca[207517]: [WARNING]  (207521) : Exiting Master process...
Jan 26 15:15:46 compute-1 podman[207816]: 2026-01-26 15:15:46.531156903 +0000 UTC m=+0.027674225 container kill b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 15:15:46 compute-1 neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca[207517]: [ALERT]    (207521) : Current worker (207523) exited with code 143 (Terminated)
Jan 26 15:15:46 compute-1 neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca[207517]: [WARNING]  (207521) : All workers exited. Exiting... (0)
Jan 26 15:15:46 compute-1 systemd[1]: libpod-b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b.scope: Deactivated successfully.
Jan 26 15:15:46 compute-1 podman[207832]: 2026-01-26 15:15:46.573660365 +0000 UTC m=+0.027370738 container died b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, tcib_managed=true)
Jan 26 15:15:46 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b-userdata-shm.mount: Deactivated successfully.
Jan 26 15:15:46 compute-1 systemd[1]: var-lib-containers-storage-overlay-4e453138fdea51e8c720e039034c340eb73742d8e963831191fad92a481fdf06-merged.mount: Deactivated successfully.
Jan 26 15:15:46 compute-1 podman[207832]: 2026-01-26 15:15:46.612940757 +0000 UTC m=+0.066651110 container cleanup b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 15:15:46 compute-1 systemd[1]: libpod-conmon-b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b.scope: Deactivated successfully.
Jan 26 15:15:46 compute-1 podman[207839]: 2026-01-26 15:15:46.636965431 +0000 UTC m=+0.072149895 container remove b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.645 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee90e45-4ae0-4cee-a894-921ee309eb69]: (4, ("Mon Jan 26 03:15:46 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca (b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b)\nb5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b\nMon Jan 26 03:15:46 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca (b5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b)\nb5e4cd5a9117946b410724334684aa025f8b490836776ebbe20ba9e85e0caa3b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.646 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[338b8bdc-ed74-454f-9087-8ed9326f0bbb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.647 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/95b60e5a-ede2-4bc6-80f8-81c875c613ca.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.647 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d7fef2e5-5674-46ae-966a-a49c78a8a1c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.648 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95b60e5a-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:15:46 compute-1 kernel: tap95b60e5a-e0: left promiscuous mode
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.650 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.678 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.682 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[90e15d3d-a092-4641-a51e-deebf8325fcc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:15:46 compute-1 nova_compute[183403]: 2026-01-26 15:15:46.691 183407 INFO nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.695 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ee7117-a582-4da7-9d2a-2ad86b362ab1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.696 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ceaa949b-e6a2-4655-91b9-beeb8cdf4949]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.710 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e1260ab3-f87b-4b1e-bdce-587fb4aff09d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421471, 'reachable_time': 41448, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207867, 'error': None, 'target': 'ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:15:46 compute-1 systemd[1]: run-netns-ovnmeta\x2d95b60e5a\x2dede2\x2d4bc6\x2d80f8\x2d81c875c613ca.mount: Deactivated successfully.
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.714 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-95b60e5a-ede2-4bc6-80f8-81c875c613ca deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:15:46 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:15:46.715 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[ed765797-c4cb-426f-b3cc-18cf8d502fb9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.193 183407 DEBUG nova.virt.libvirt.guest [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'a4414965-214d-4bdb-8c67-d0774c73ff66' (instance-0000000b) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.195 183407 INFO nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Migration operation has completed
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.195 183407 INFO nova.compute.manager [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] _post_live_migration() is started..
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.210 183407 WARNING neutronclient.v2_0.client [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.210 183407 WARNING neutronclient.v2_0.client [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.585 183407 DEBUG nova.compute.manager [req-d4c66245-0881-457d-882c-81b0229acbcf req-acebcfe2-1250-458d-b0ac-c994bf6439db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.586 183407 DEBUG oslo_concurrency.lockutils [req-d4c66245-0881-457d-882c-81b0229acbcf req-acebcfe2-1250-458d-b0ac-c994bf6439db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.586 183407 DEBUG oslo_concurrency.lockutils [req-d4c66245-0881-457d-882c-81b0229acbcf req-acebcfe2-1250-458d-b0ac-c994bf6439db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.587 183407 DEBUG oslo_concurrency.lockutils [req-d4c66245-0881-457d-882c-81b0229acbcf req-acebcfe2-1250-458d-b0ac-c994bf6439db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.587 183407 DEBUG nova.compute.manager [req-d4c66245-0881-457d-882c-81b0229acbcf req-acebcfe2-1250-458d-b0ac-c994bf6439db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] No waiting events found dispatching network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:15:47 compute-1 nova_compute[183403]: 2026-01-26 15:15:47.587 183407 DEBUG nova.compute.manager [req-d4c66245-0881-457d-882c-81b0229acbcf req-acebcfe2-1250-458d-b0ac-c994bf6439db 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.390 183407 DEBUG nova.compute.manager [req-0663fb95-0a2b-4672-b0f2-264c6adfaa0d req-d9a1de41-befc-4c4a-847f-0728374a6327 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.391 183407 DEBUG oslo_concurrency.lockutils [req-0663fb95-0a2b-4672-b0f2-264c6adfaa0d req-d9a1de41-befc-4c4a-847f-0728374a6327 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.392 183407 DEBUG oslo_concurrency.lockutils [req-0663fb95-0a2b-4672-b0f2-264c6adfaa0d req-d9a1de41-befc-4c4a-847f-0728374a6327 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.392 183407 DEBUG oslo_concurrency.lockutils [req-0663fb95-0a2b-4672-b0f2-264c6adfaa0d req-d9a1de41-befc-4c4a-847f-0728374a6327 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.392 183407 DEBUG nova.compute.manager [req-0663fb95-0a2b-4672-b0f2-264c6adfaa0d req-d9a1de41-befc-4c4a-847f-0728374a6327 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] No waiting events found dispatching network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.393 183407 DEBUG nova.compute.manager [req-0663fb95-0a2b-4672-b0f2-264c6adfaa0d req-d9a1de41-befc-4c4a-847f-0728374a6327 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.588 183407 DEBUG nova.network.neutron [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Activated binding for port 1cb0a587-a627-48da-a9b1-ab7673e447ef and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.589 183407 DEBUG nova.compute.manager [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.590 183407 DEBUG nova.virt.libvirt.vif [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:14:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-919966140',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-919966140',id=11,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:14:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='53b3f69d4de4466089ba08d308b821a1',ramdisk_id='',reservation_id='r-y70rxn4m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1373478536',owner_user_name='tempest-TestExecuteBasicStrategy-1373478536-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:15:22Z,user_data=None,user_id='7936c169fc9442e1865811f8febd438d',uuid=a4414965-214d-4bdb-8c67-d0774c73ff66,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.591 183407 DEBUG nova.network.os_vif_util [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "address": "fa:16:3e:e2:90:44", "network": {"id": "95b60e5a-ede2-4bc6-80f8-81c875c613ca", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-413334478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eff4cc862a94ba0b9bdd2e7cd089d3b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb0a587-a6", "ovs_interfaceid": "1cb0a587-a627-48da-a9b1-ab7673e447ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.592 183407 DEBUG nova.network.os_vif_util [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:90:44,bridge_name='br-int',has_traffic_filtering=True,id=1cb0a587-a627-48da-a9b1-ab7673e447ef,network=Network(95b60e5a-ede2-4bc6-80f8-81c875c613ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb0a587-a6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.592 183407 DEBUG os_vif [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:90:44,bridge_name='br-int',has_traffic_filtering=True,id=1cb0a587-a627-48da-a9b1-ab7673e447ef,network=Network(95b60e5a-ede2-4bc6-80f8-81c875c613ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb0a587-a6') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.596 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.597 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1cb0a587-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.599 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.602 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.604 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.604 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c79c5b9b-6fea-4215-bf29-0dcdddfe214e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.605 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.606 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.617 183407 INFO os_vif [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:90:44,bridge_name='br-int',has_traffic_filtering=True,id=1cb0a587-a627-48da-a9b1-ab7673e447ef,network=Network(95b60e5a-ede2-4bc6-80f8-81c875c613ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb0a587-a6')
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.618 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.618 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.619 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.619 183407 DEBUG nova.compute.manager [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.620 183407 INFO nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Deleting instance files /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66_del
Jan 26 15:15:48 compute-1 nova_compute[183403]: 2026-01-26 15:15:48.620 183407 INFO nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Deletion of /var/lib/nova/instances/a4414965-214d-4bdb-8c67-d0774c73ff66_del complete
Jan 26 15:15:49 compute-1 openstack_network_exporter[195610]: ERROR   15:15:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:15:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:15:49 compute-1 openstack_network_exporter[195610]: ERROR   15:15:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:15:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.645 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.677 183407 DEBUG nova.compute.manager [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.677 183407 DEBUG oslo_concurrency.lockutils [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.678 183407 DEBUG oslo_concurrency.lockutils [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.678 183407 DEBUG oslo_concurrency.lockutils [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.678 183407 DEBUG nova.compute.manager [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] No waiting events found dispatching network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.679 183407 WARNING nova.compute.manager [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received unexpected event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef for instance with vm_state active and task_state migrating.
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.679 183407 DEBUG nova.compute.manager [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.679 183407 DEBUG oslo_concurrency.lockutils [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.680 183407 DEBUG oslo_concurrency.lockutils [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.680 183407 DEBUG oslo_concurrency.lockutils [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.680 183407 DEBUG nova.compute.manager [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] No waiting events found dispatching network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.680 183407 DEBUG nova.compute.manager [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-unplugged-1cb0a587-a627-48da-a9b1-ab7673e447ef for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.680 183407 DEBUG nova.compute.manager [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.680 183407 DEBUG oslo_concurrency.lockutils [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.680 183407 DEBUG oslo_concurrency.lockutils [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.681 183407 DEBUG oslo_concurrency.lockutils [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.681 183407 DEBUG nova.compute.manager [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] No waiting events found dispatching network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:15:49 compute-1 nova_compute[183403]: 2026-01-26 15:15:49.681 183407 WARNING nova.compute.manager [req-6716795c-063d-4fd0-a1ac-cf67480e42df req-17f135c2-e75e-4446-9fcd-46c788bc1370 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received unexpected event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef for instance with vm_state active and task_state migrating.
Jan 26 15:15:52 compute-1 nova_compute[183403]: 2026-01-26 15:15:52.027 183407 DEBUG nova.compute.manager [req-06e49096-c25d-4140-aea3-83a60f62ac68 req-31d0599a-1d76-46f8-bc00-5e706661bee0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:15:52 compute-1 nova_compute[183403]: 2026-01-26 15:15:52.028 183407 DEBUG oslo_concurrency.lockutils [req-06e49096-c25d-4140-aea3-83a60f62ac68 req-31d0599a-1d76-46f8-bc00-5e706661bee0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:52 compute-1 nova_compute[183403]: 2026-01-26 15:15:52.028 183407 DEBUG oslo_concurrency.lockutils [req-06e49096-c25d-4140-aea3-83a60f62ac68 req-31d0599a-1d76-46f8-bc00-5e706661bee0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:52 compute-1 nova_compute[183403]: 2026-01-26 15:15:52.029 183407 DEBUG oslo_concurrency.lockutils [req-06e49096-c25d-4140-aea3-83a60f62ac68 req-31d0599a-1d76-46f8-bc00-5e706661bee0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:52 compute-1 nova_compute[183403]: 2026-01-26 15:15:52.029 183407 DEBUG nova.compute.manager [req-06e49096-c25d-4140-aea3-83a60f62ac68 req-31d0599a-1d76-46f8-bc00-5e706661bee0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] No waiting events found dispatching network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:15:52 compute-1 nova_compute[183403]: 2026-01-26 15:15:52.029 183407 WARNING nova.compute.manager [req-06e49096-c25d-4140-aea3-83a60f62ac68 req-31d0599a-1d76-46f8-bc00-5e706661bee0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Received unexpected event network-vif-plugged-1cb0a587-a627-48da-a9b1-ab7673e447ef for instance with vm_state active and task_state migrating.
Jan 26 15:15:53 compute-1 nova_compute[183403]: 2026-01-26 15:15:53.606 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:54 compute-1 nova_compute[183403]: 2026-01-26 15:15:54.647 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:57 compute-1 podman[207870]: 2026-01-26 15:15:57.919365327 +0000 UTC m=+0.078166215 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.6, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git)
Jan 26 15:15:57 compute-1 podman[207869]: 2026-01-26 15:15:57.919388448 +0000 UTC m=+0.092393557 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:15:58 compute-1 nova_compute[183403]: 2026-01-26 15:15:58.446 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:58 compute-1 nova_compute[183403]: 2026-01-26 15:15:58.447 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:58 compute-1 nova_compute[183403]: 2026-01-26 15:15:58.447 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "a4414965-214d-4bdb-8c67-d0774c73ff66-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:58 compute-1 nova_compute[183403]: 2026-01-26 15:15:58.609 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:15:58 compute-1 sshd-session[207772]: Invalid user theta from 185.246.128.170 port 51213
Jan 26 15:15:58 compute-1 nova_compute[183403]: 2026-01-26 15:15:58.966 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:58 compute-1 nova_compute[183403]: 2026-01-26 15:15:58.967 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:58 compute-1 nova_compute[183403]: 2026-01-26 15:15:58.967 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:15:58 compute-1 nova_compute[183403]: 2026-01-26 15:15:58.968 183407 DEBUG nova.compute.resource_tracker [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:15:59 compute-1 nova_compute[183403]: 2026-01-26 15:15:59.154 183407 WARNING nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:15:59 compute-1 nova_compute[183403]: 2026-01-26 15:15:59.156 183407 DEBUG oslo_concurrency.processutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:15:59 compute-1 nova_compute[183403]: 2026-01-26 15:15:59.183 183407 DEBUG oslo_concurrency.processutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:15:59 compute-1 nova_compute[183403]: 2026-01-26 15:15:59.184 183407 DEBUG nova.compute.resource_tracker [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5838MB free_disk=73.1479377746582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": 
"0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:15:59 compute-1 nova_compute[183403]: 2026-01-26 15:15:59.184 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:15:59 compute-1 nova_compute[183403]: 2026-01-26 15:15:59.185 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:15:59 compute-1 sshd-session[207772]: Disconnecting invalid user theta 185.246.128.170 port 51213: Change of username or service not allowed: (theta,ssh-connection) -> (instrument,ssh-connection) [preauth]
Jan 26 15:15:59 compute-1 nova_compute[183403]: 2026-01-26 15:15:59.657 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:00 compute-1 nova_compute[183403]: 2026-01-26 15:16:00.209 183407 DEBUG nova.compute.resource_tracker [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Migration for instance a4414965-214d-4bdb-8c67-d0774c73ff66 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 15:16:00 compute-1 nova_compute[183403]: 2026-01-26 15:16:00.727 183407 DEBUG nova.compute.resource_tracker [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 15:16:00 compute-1 nova_compute[183403]: 2026-01-26 15:16:00.753 183407 DEBUG nova.compute.resource_tracker [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Migration 64bb08fc-292f-4849-952c-8f6eac3e7fcc is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 15:16:00 compute-1 nova_compute[183403]: 2026-01-26 15:16:00.753 183407 DEBUG nova.compute.resource_tracker [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:16:00 compute-1 nova_compute[183403]: 2026-01-26 15:16:00.754 183407 DEBUG nova.compute.resource_tracker [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:15:59 up  1:11,  0 user,  load average: 0.11, 0.21, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:16:00 compute-1 nova_compute[183403]: 2026-01-26 15:16:00.794 183407 DEBUG nova.compute.provider_tree [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:16:01 compute-1 nova_compute[183403]: 2026-01-26 15:16:01.304 183407 DEBUG nova.scheduler.client.report [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:16:01 compute-1 nova_compute[183403]: 2026-01-26 15:16:01.812 183407 DEBUG nova.compute.resource_tracker [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:16:01 compute-1 nova_compute[183403]: 2026-01-26 15:16:01.813 183407 DEBUG oslo_concurrency.lockutils [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.628s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:16:01 compute-1 nova_compute[183403]: 2026-01-26 15:16:01.834 183407 INFO nova.compute.manager [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Jan 26 15:16:02 compute-1 nova_compute[183403]: 2026-01-26 15:16:02.907 183407 INFO nova.scheduler.client.report [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Deleted allocation for migration 64bb08fc-292f-4849-952c-8f6eac3e7fcc
Jan 26 15:16:02 compute-1 nova_compute[183403]: 2026-01-26 15:16:02.908 183407 DEBUG nova.virt.libvirt.driver [None req-baf429e3-62b4-4a5e-8833-f66615b67198 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: a4414965-214d-4bdb-8c67-d0774c73ff66] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 15:16:03 compute-1 nova_compute[183403]: 2026-01-26 15:16:03.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:03 compute-1 nova_compute[183403]: 2026-01-26 15:16:03.612 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:04 compute-1 nova_compute[183403]: 2026-01-26 15:16:04.659 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:05 compute-1 podman[192725]: time="2026-01-26T15:16:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:16:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:16:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:16:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:16:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2189 "" "Go-http-client/1.1"
Jan 26 15:16:06 compute-1 nova_compute[183403]: 2026-01-26 15:16:06.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:06 compute-1 sshd-session[207917]: Invalid user instrument from 185.246.128.170 port 62651
Jan 26 15:16:06 compute-1 sshd-session[207917]: Disconnecting invalid user instrument 185.246.128.170 port 62651: Change of username or service not allowed: (instrument,ssh-connection) -> (landscape,ssh-connection) [preauth]
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.090 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.304 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.306 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.326 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.327 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5837MB free_disk=73.1479377746582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.328 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:16:07 compute-1 nova_compute[183403]: 2026-01-26 15:16:07.328 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:16:07 compute-1 podman[207922]: 2026-01-26 15:16:07.915187885 +0000 UTC m=+0.083062716 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 15:16:07 compute-1 podman[207921]: 2026-01-26 15:16:07.993677266 +0000 UTC m=+0.160842980 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, 
config_id=ovn_controller, io.buildah.version=1.41.4)
Jan 26 15:16:08 compute-1 nova_compute[183403]: 2026-01-26 15:16:08.382 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:16:08 compute-1 nova_compute[183403]: 2026-01-26 15:16:08.383 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:16:07 up  1:11,  0 user,  load average: 0.10, 0.21, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:16:08 compute-1 nova_compute[183403]: 2026-01-26 15:16:08.412 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:16:08 compute-1 nova_compute[183403]: 2026-01-26 15:16:08.615 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:08 compute-1 nova_compute[183403]: 2026-01-26 15:16:08.920 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:16:09 compute-1 nova_compute[183403]: 2026-01-26 15:16:09.433 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:16:09 compute-1 nova_compute[183403]: 2026-01-26 15:16:09.434 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:16:09 compute-1 nova_compute[183403]: 2026-01-26 15:16:09.434 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:09 compute-1 nova_compute[183403]: 2026-01-26 15:16:09.435 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:16:09 compute-1 nova_compute[183403]: 2026-01-26 15:16:09.663 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:09 compute-1 nova_compute[183403]: 2026-01-26 15:16:09.941 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:16:09 compute-1 nova_compute[183403]: 2026-01-26 15:16:09.942 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:09 compute-1 nova_compute[183403]: 2026-01-26 15:16:09.943 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:16:11 compute-1 nova_compute[183403]: 2026-01-26 15:16:11.451 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:11 compute-1 nova_compute[183403]: 2026-01-26 15:16:11.451 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:11 compute-1 nova_compute[183403]: 2026-01-26 15:16:11.452 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:11 compute-1 nova_compute[183403]: 2026-01-26 15:16:11.452 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:11 compute-1 nova_compute[183403]: 2026-01-26 15:16:11.453 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:16:11 compute-1 nova_compute[183403]: 2026-01-26 15:16:11.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:13 compute-1 nova_compute[183403]: 2026-01-26 15:16:13.626 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:14 compute-1 nova_compute[183403]: 2026-01-26 15:16:14.082 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:14 compute-1 nova_compute[183403]: 2026-01-26 15:16:14.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:14 compute-1 nova_compute[183403]: 2026-01-26 15:16:14.663 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:16 compute-1 sshd-session[207967]: Invalid user landscape from 185.246.128.170 port 34837
Jan 26 15:16:17 compute-1 nova_compute[183403]: 2026-01-26 15:16:17.567 183407 DEBUG nova.compute.manager [None req-41683a66-624b-4191-973f-b9c5695b2695 c36e2929624c484886f7858d405633e8 179f3c996d8f4e7ea1b0aca3ec76f02e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Jan 26 15:16:17 compute-1 nova_compute[183403]: 2026-01-26 15:16:17.639 183407 DEBUG nova.compute.provider_tree [None req-41683a66-624b-4191-973f-b9c5695b2695 c36e2929624c484886f7858d405633e8 179f3c996d8f4e7ea1b0aca3ec76f02e - - default default] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 12 to 15 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 15:16:17 compute-1 sshd-session[207967]: Disconnecting invalid user landscape 185.246.128.170 port 34837: Change of username or service not allowed: (landscape,ssh-connection) -> (user03,ssh-connection) [preauth]
Jan 26 15:16:18 compute-1 nova_compute[183403]: 2026-01-26 15:16:18.628 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:19 compute-1 openstack_network_exporter[195610]: ERROR   15:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:16:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:16:19 compute-1 openstack_network_exporter[195610]: ERROR   15:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:16:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:16:19 compute-1 nova_compute[183403]: 2026-01-26 15:16:19.664 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:23 compute-1 nova_compute[183403]: 2026-01-26 15:16:23.630 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:24 compute-1 nova_compute[183403]: 2026-01-26 15:16:24.667 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:25 compute-1 nova_compute[183403]: 2026-01-26 15:16:25.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:28 compute-1 nova_compute[183403]: 2026-01-26 15:16:28.634 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:28 compute-1 podman[207971]: 2026-01-26 15:16:28.913898628 +0000 UTC m=+0.081241616 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:16:28 compute-1 podman[207972]: 2026-01-26 15:16:28.932922822 +0000 UTC m=+0.096617364 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Jan 26 15:16:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:29.045 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:16:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:29.045 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:16:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:29.045 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:16:29 compute-1 nova_compute[183403]: 2026-01-26 15:16:29.670 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:32 compute-1 sshd-session[207969]: Invalid user user03 from 185.246.128.170 port 18182
Jan 26 15:16:33 compute-1 sshd-session[207969]: Disconnecting invalid user user03 185.246.128.170 port 18182: Change of username or service not allowed: (user03,ssh-connection) -> (craft,ssh-connection) [preauth]
Jan 26 15:16:33 compute-1 nova_compute[183403]: 2026-01-26 15:16:33.636 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:34 compute-1 nova_compute[183403]: 2026-01-26 15:16:34.672 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:35 compute-1 podman[192725]: time="2026-01-26T15:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:16:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:16:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2183 "" "Go-http-client/1.1"
Jan 26 15:16:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:37.456 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:da:e1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-21ea6b9f-952b-4141-b3fa-f10dcf67c493', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-21ea6b9f-952b-4141-b3fa-f10dcf67c493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1b69143c75c4eceb9883fed8a97d1b5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d326aad-ed69-40a3-8ce2-b26797d1910e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=36a76274-3bec-4fa7-97f2-181acc258cc3) old=Port_Binding(mac=['fa:16:3e:86:da:e1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-21ea6b9f-952b-4141-b3fa-f10dcf67c493', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-21ea6b9f-952b-4141-b3fa-f10dcf67c493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1b69143c75c4eceb9883fed8a97d1b5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:16:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:37.458 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 36a76274-3bec-4fa7-97f2-181acc258cc3 in datapath 21ea6b9f-952b-4141-b3fa-f10dcf67c493 updated
Jan 26 15:16:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:37.459 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 21ea6b9f-952b-4141-b3fa-f10dcf67c493, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:16:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:37.461 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0247b448-d686-415f-9daf-10b8036fb719]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:16:38 compute-1 nova_compute[183403]: 2026-01-26 15:16:38.639 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:38 compute-1 podman[208020]: 2026-01-26 15:16:38.901721588 +0000 UTC m=+0.072332379 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 26 15:16:38 compute-1 podman[208019]: 2026-01-26 15:16:38.93233301 +0000 UTC m=+0.115632749 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
tcib_managed=true, container_name=ovn_controller)
Jan 26 15:16:39 compute-1 nova_compute[183403]: 2026-01-26 15:16:39.023 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:16:39 compute-1 nova_compute[183403]: 2026-01-26 15:16:39.673 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:41 compute-1 nova_compute[183403]: 2026-01-26 15:16:41.319 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:41.320 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:16:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:41.321 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:16:41 compute-1 sshd-session[208017]: Invalid user craft from 185.246.128.170 port 23856
Jan 26 15:16:42 compute-1 sshd-session[208017]: Disconnecting invalid user craft 185.246.128.170 port 23856: Change of username or service not allowed: (craft,ssh-connection) -> (vodafone,ssh-connection) [preauth]
Jan 26 15:16:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:43.323 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:16:43 compute-1 nova_compute[183403]: 2026-01-26 15:16:43.642 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:44 compute-1 nova_compute[183403]: 2026-01-26 15:16:44.675 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:47 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:47.401 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:b9:c1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4524a851-afbf-45e3-a175-7a64775b1fd3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4524a851-afbf-45e3-a175-7a64775b1fd3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c208cffd356489c8a846066fa685475', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59eb205a-46c9-4a83-afd9-70dc87558fbc, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=32140004-faac-4b3e-adc3-80e6679a512d) old=Port_Binding(mac=['fa:16:3e:86:b9:c1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4524a851-afbf-45e3-a175-7a64775b1fd3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4524a851-afbf-45e3-a175-7a64775b1fd3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c208cffd356489c8a846066fa685475', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:16:47 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:47.402 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 32140004-faac-4b3e-adc3-80e6679a512d in datapath 4524a851-afbf-45e3-a175-7a64775b1fd3 updated
Jan 26 15:16:47 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:47.403 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4524a851-afbf-45e3-a175-7a64775b1fd3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:16:47 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:16:47.404 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[20a3753e-13e3-4bc6-8787-f4147076de43]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:16:48 compute-1 nova_compute[183403]: 2026-01-26 15:16:48.644 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:49 compute-1 sshd-session[208065]: Invalid user vodafone from 185.246.128.170 port 57671
Jan 26 15:16:49 compute-1 openstack_network_exporter[195610]: ERROR   15:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:16:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:16:49 compute-1 openstack_network_exporter[195610]: ERROR   15:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:16:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:16:49 compute-1 sshd-session[208065]: Disconnecting invalid user vodafone 185.246.128.170 port 57671: Change of username or service not allowed: (vodafone,ssh-connection) -> (mary,ssh-connection) [preauth]
Jan 26 15:16:49 compute-1 nova_compute[183403]: 2026-01-26 15:16:49.677 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:51 compute-1 sshd-session[208067]: Invalid user sol from 80.94.92.171 port 38872
Jan 26 15:16:52 compute-1 sshd-session[208067]: Connection closed by invalid user sol 80.94.92.171 port 38872 [preauth]
Jan 26 15:16:53 compute-1 nova_compute[183403]: 2026-01-26 15:16:53.647 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:54 compute-1 nova_compute[183403]: 2026-01-26 15:16:54.678 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:54 compute-1 ovn_controller[95641]: 2026-01-26T15:16:54Z|00105|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 26 15:16:58 compute-1 nova_compute[183403]: 2026-01-26 15:16:58.660 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:59 compute-1 nova_compute[183403]: 2026-01-26 15:16:59.725 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:16:59 compute-1 podman[208072]: 2026-01-26 15:16:59.884909507 +0000 UTC m=+0.059197452 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base 
Image 9 Minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6)
Jan 26 15:16:59 compute-1 podman[208071]: 2026-01-26 15:16:59.911612505 +0000 UTC m=+0.092829134 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:17:03 compute-1 sshd-session[208069]: Invalid user mary from 185.246.128.170 port 46581
Jan 26 15:17:03 compute-1 nova_compute[183403]: 2026-01-26 15:17:03.663 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:04 compute-1 sshd-session[208069]: Disconnecting invalid user mary 185.246.128.170 port 46581: Change of username or service not allowed: (mary,ssh-connection) -> (log,ssh-connection) [preauth]
Jan 26 15:17:04 compute-1 nova_compute[183403]: 2026-01-26 15:17:04.726 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:05 compute-1 nova_compute[183403]: 2026-01-26 15:17:05.086 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:17:05 compute-1 podman[192725]: time="2026-01-26T15:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:17:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:17:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2186 "" "Go-http-client/1.1"
Jan 26 15:17:06 compute-1 nova_compute[183403]: 2026-01-26 15:17:06.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.094 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.303 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.304 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.334 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.336 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5849MB free_disk=73.14796829223633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.336 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:17:07 compute-1 nova_compute[183403]: 2026-01-26 15:17:07.337 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:17:08 compute-1 nova_compute[183403]: 2026-01-26 15:17:08.413 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:17:08 compute-1 nova_compute[183403]: 2026-01-26 15:17:08.413 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:17:07 up  1:12,  0 user,  load average: 0.04, 0.17, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:17:08 compute-1 nova_compute[183403]: 2026-01-26 15:17:08.644 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:17:08 compute-1 nova_compute[183403]: 2026-01-26 15:17:08.704 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:08 compute-1 nova_compute[183403]: 2026-01-26 15:17:08.739 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:17:08 compute-1 nova_compute[183403]: 2026-01-26 15:17:08.740 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:17:08 compute-1 nova_compute[183403]: 2026-01-26 15:17:08.758 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:17:08 compute-1 nova_compute[183403]: 2026-01-26 15:17:08.780 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:17:08 compute-1 nova_compute[183403]: 2026-01-26 15:17:08.806 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:17:09 compute-1 nova_compute[183403]: 2026-01-26 15:17:09.542 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:17:09 compute-1 nova_compute[183403]: 2026-01-26 15:17:09.761 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:09 compute-1 podman[208118]: 2026-01-26 15:17:09.887610841 +0000 UTC m=+0.060065615 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 26 15:17:09 compute-1 podman[208117]: 2026-01-26 15:17:09.93020333 +0000 UTC m=+0.097173478 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 15:17:10 compute-1 nova_compute[183403]: 2026-01-26 15:17:10.051 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:17:10 compute-1 nova_compute[183403]: 2026-01-26 15:17:10.052 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.715s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:17:12 compute-1 nova_compute[183403]: 2026-01-26 15:17:12.051 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:17:12 compute-1 nova_compute[183403]: 2026-01-26 15:17:12.051 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:17:12 compute-1 nova_compute[183403]: 2026-01-26 15:17:12.051 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:17:12 compute-1 nova_compute[183403]: 2026-01-26 15:17:12.052 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:17:12 compute-1 nova_compute[183403]: 2026-01-26 15:17:12.052 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:17:13 compute-1 nova_compute[183403]: 2026-01-26 15:17:13.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:17:13 compute-1 nova_compute[183403]: 2026-01-26 15:17:13.707 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:14 compute-1 sshd-session[208115]: Invalid user log from 185.246.128.170 port 57109
Jan 26 15:17:14 compute-1 nova_compute[183403]: 2026-01-26 15:17:14.763 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:15 compute-1 nova_compute[183403]: 2026-01-26 15:17:15.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:17:15 compute-1 sshd-session[208115]: Disconnecting invalid user log 185.246.128.170 port 57109: Change of username or service not allowed: (log,ssh-connection) -> (odoo16,ssh-connection) [preauth]
Jan 26 15:17:18 compute-1 nova_compute[183403]: 2026-01-26 15:17:18.709 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:19 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:19.406 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:dc:c5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2d0174cdadae4803981796a2cea457d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ffea2f62-0986-47bd-a80c-89f1c9decf3f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=33751b84-abc7-465c-a58b-58ca2b0cbc0a) old=Port_Binding(mac=['fa:16:3e:09:dc:c5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2d0174cdadae4803981796a2cea457d2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:17:19 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:19.407 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 33751b84-abc7-465c-a58b-58ca2b0cbc0a in datapath 0df777d6-b389-44bc-b166-8208ab926234 updated
Jan 26 15:17:19 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:19.407 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0df777d6-b389-44bc-b166-8208ab926234, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:17:19 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:19.409 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[77d9d9ee-76d9-468d-a28e-3c25f528c8d1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:17:19 compute-1 openstack_network_exporter[195610]: ERROR   15:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:17:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:17:19 compute-1 openstack_network_exporter[195610]: ERROR   15:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:17:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:17:19 compute-1 nova_compute[183403]: 2026-01-26 15:17:19.765 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:23 compute-1 nova_compute[183403]: 2026-01-26 15:17:23.737 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:24 compute-1 nova_compute[183403]: 2026-01-26 15:17:24.801 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:27 compute-1 sshd-session[208164]: Invalid user odoo16 from 185.246.128.170 port 41316
Jan 26 15:17:27 compute-1 sshd-session[208164]: Disconnecting invalid user odoo16 185.246.128.170 port 41316: Change of username or service not allowed: (odoo16,ssh-connection) -> (astra,ssh-connection) [preauth]
Jan 26 15:17:28 compute-1 nova_compute[183403]: 2026-01-26 15:17:28.741 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:29.046 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:17:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:29.047 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:17:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:29.047 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:17:29 compute-1 nova_compute[183403]: 2026-01-26 15:17:29.804 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:30.007 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:84:0d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-79e0c61f-8f5a-4007-bb65-b88dfdddb2b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79e0c61f-8f5a-4007-bb65-b88dfdddb2b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e718840b-e20a-4883-b818-b58eb3e4c4ae, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1327b5d9-989c-41f3-9cd9-a370bd35d991) old=Port_Binding(mac=['fa:16:3e:dd:84:0d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-79e0c61f-8f5a-4007-bb65-b88dfdddb2b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79e0c61f-8f5a-4007-bb65-b88dfdddb2b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:17:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:30.008 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1327b5d9-989c-41f3-9cd9-a370bd35d991 in datapath 79e0c61f-8f5a-4007-bb65-b88dfdddb2b1 updated
Jan 26 15:17:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:30.008 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79e0c61f-8f5a-4007-bb65-b88dfdddb2b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:17:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:17:30.012 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5b75b549-b6f9-4fe4-b162-8bc29d2db987]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:17:30 compute-1 podman[208170]: 2026-01-26 15:17:30.896306162 +0000 UTC m=+0.072638682 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Jan 26 15:17:30 compute-1 podman[208169]: 2026-01-26 15:17:30.905596075 +0000 UTC m=+0.077631297 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:17:32 compute-1 sshd-session[208166]: Invalid user astra from 185.246.128.170 port 63159
Jan 26 15:17:32 compute-1 sshd-session[208166]: Disconnecting invalid user astra 185.246.128.170 port 63159: Change of username or service not allowed: (astra,ssh-connection) -> (Administrator,ssh-connection) [preauth]
Jan 26 15:17:33 compute-1 nova_compute[183403]: 2026-01-26 15:17:33.743 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:34 compute-1 nova_compute[183403]: 2026-01-26 15:17:34.806 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:35 compute-1 podman[192725]: time="2026-01-26T15:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:17:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:17:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2187 "" "Go-http-client/1.1"
Jan 26 15:17:36 compute-1 sshd-session[208213]: Invalid user ubnt from 176.120.22.13 port 37338
Jan 26 15:17:36 compute-1 sshd-session[208213]: Connection reset by invalid user ubnt 176.120.22.13 port 37338 [preauth]
Jan 26 15:17:38 compute-1 nova_compute[183403]: 2026-01-26 15:17:38.747 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:39 compute-1 nova_compute[183403]: 2026-01-26 15:17:39.807 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:39 compute-1 sshd-session[208215]: Connection reset by authenticating user root 176.120.22.13 port 37352 [preauth]
Jan 26 15:17:40 compute-1 podman[208221]: 2026-01-26 15:17:40.90403635 +0000 UTC m=+0.071492283 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 15:17:40 compute-1 podman[208220]: 2026-01-26 15:17:40.964878745 +0000 UTC m=+0.144975135 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 15:17:42 compute-1 sshd-session[208218]: Invalid user admin from 176.120.22.13 port 37364
Jan 26 15:17:42 compute-1 sshd-session[208218]: Connection reset by invalid user admin 176.120.22.13 port 37364 [preauth]
Jan 26 15:17:43 compute-1 nova_compute[183403]: 2026-01-26 15:17:43.749 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:44 compute-1 sshd-session[208212]: Invalid user Administrator from 185.246.128.170 port 21346
Jan 26 15:17:44 compute-1 sshd-session[208267]: Invalid user uucp from 176.120.22.13 port 62678
Jan 26 15:17:44 compute-1 nova_compute[183403]: 2026-01-26 15:17:44.810 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:45 compute-1 sshd-session[208267]: Connection reset by invalid user uucp 176.120.22.13 port 62678 [preauth]
Jan 26 15:17:47 compute-1 sshd-session[208269]: Connection reset by authenticating user root 176.120.22.13 port 62694 [preauth]
Jan 26 15:17:48 compute-1 nova_compute[183403]: 2026-01-26 15:17:48.793 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:49 compute-1 openstack_network_exporter[195610]: ERROR   15:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:17:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:17:49 compute-1 openstack_network_exporter[195610]: ERROR   15:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:17:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:17:49 compute-1 nova_compute[183403]: 2026-01-26 15:17:49.812 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:53 compute-1 nova_compute[183403]: 2026-01-26 15:17:53.826 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:54 compute-1 nova_compute[183403]: 2026-01-26 15:17:54.814 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:58 compute-1 nova_compute[183403]: 2026-01-26 15:17:58.829 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:59 compute-1 nova_compute[183403]: 2026-01-26 15:17:59.349 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "c52ee407-1afb-4ae3-ae7f-792592e6badf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:17:59 compute-1 nova_compute[183403]: 2026-01-26 15:17:59.350 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:17:59 compute-1 nova_compute[183403]: 2026-01-26 15:17:59.817 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:17:59 compute-1 nova_compute[183403]: 2026-01-26 15:17:59.857 183407 DEBUG nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:18:00 compute-1 nova_compute[183403]: 2026-01-26 15:18:00.529 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:00 compute-1 nova_compute[183403]: 2026-01-26 15:18:00.530 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:00 compute-1 nova_compute[183403]: 2026-01-26 15:18:00.539 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:18:00 compute-1 nova_compute[183403]: 2026-01-26 15:18:00.539 183407 INFO nova.compute.claims [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:18:01 compute-1 nova_compute[183403]: 2026-01-26 15:18:01.617 183407 DEBUG nova.compute.provider_tree [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:18:01 compute-1 podman[208271]: 2026-01-26 15:18:01.874044621 +0000 UTC m=+0.053023120 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:18:01 compute-1 podman[208272]: 2026-01-26 15:18:01.881117309 +0000 UTC m=+0.054265022 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, config_id=openstack_network_exporter, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:18:02 compute-1 nova_compute[183403]: 2026-01-26 15:18:02.128 183407 DEBUG nova.scheduler.client.report [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:18:02 compute-1 nova_compute[183403]: 2026-01-26 15:18:02.642 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:02 compute-1 nova_compute[183403]: 2026-01-26 15:18:02.643 183407 DEBUG nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:18:03 compute-1 nova_compute[183403]: 2026-01-26 15:18:03.156 183407 DEBUG nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:18:03 compute-1 nova_compute[183403]: 2026-01-26 15:18:03.156 183407 DEBUG nova.network.neutron [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:18:03 compute-1 nova_compute[183403]: 2026-01-26 15:18:03.157 183407 WARNING neutronclient.v2_0.client [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:03 compute-1 nova_compute[183403]: 2026-01-26 15:18:03.157 183407 WARNING neutronclient.v2_0.client [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:03 compute-1 sshd-session[208212]: Disconnecting invalid user Administrator 185.246.128.170 port 21346: Change of username or service not allowed: (Administrator,ssh-connection) -> (vyos,ssh-connection) [preauth]
Jan 26 15:18:03 compute-1 nova_compute[183403]: 2026-01-26 15:18:03.664 183407 INFO nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:18:03 compute-1 nova_compute[183403]: 2026-01-26 15:18:03.820 183407 DEBUG nova.network.neutron [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Successfully created port: 74d00ce8-8619-4c5a-a2f4-4018d57f1469 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:18:03 compute-1 nova_compute[183403]: 2026-01-26 15:18:03.833 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:04 compute-1 nova_compute[183403]: 2026-01-26 15:18:04.172 183407 DEBUG nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:18:04 compute-1 nova_compute[183403]: 2026-01-26 15:18:04.818 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:04 compute-1 nova_compute[183403]: 2026-01-26 15:18:04.851 183407 DEBUG nova.network.neutron [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Successfully updated port: 74d00ce8-8619-4c5a-a2f4-4018d57f1469 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:18:04 compute-1 nova_compute[183403]: 2026-01-26 15:18:04.920 183407 DEBUG nova.compute.manager [req-d91347e1-ede2-40b5-9981-01dd3bd340fa req-570ea7b6-03c4-4939-98ed-fa54df7608c4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Received event network-changed-74d00ce8-8619-4c5a-a2f4-4018d57f1469 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:18:04 compute-1 nova_compute[183403]: 2026-01-26 15:18:04.920 183407 DEBUG nova.compute.manager [req-d91347e1-ede2-40b5-9981-01dd3bd340fa req-570ea7b6-03c4-4939-98ed-fa54df7608c4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Refreshing instance network info cache due to event network-changed-74d00ce8-8619-4c5a-a2f4-4018d57f1469. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:18:04 compute-1 nova_compute[183403]: 2026-01-26 15:18:04.920 183407 DEBUG oslo_concurrency.lockutils [req-d91347e1-ede2-40b5-9981-01dd3bd340fa req-570ea7b6-03c4-4939-98ed-fa54df7608c4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-c52ee407-1afb-4ae3-ae7f-792592e6badf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:18:04 compute-1 nova_compute[183403]: 2026-01-26 15:18:04.920 183407 DEBUG oslo_concurrency.lockutils [req-d91347e1-ede2-40b5-9981-01dd3bd340fa req-570ea7b6-03c4-4939-98ed-fa54df7608c4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-c52ee407-1afb-4ae3-ae7f-792592e6badf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:18:04 compute-1 nova_compute[183403]: 2026-01-26 15:18:04.921 183407 DEBUG nova.network.neutron [req-d91347e1-ede2-40b5-9981-01dd3bd340fa req-570ea7b6-03c4-4939-98ed-fa54df7608c4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Refreshing network info cache for port 74d00ce8-8619-4c5a-a2f4-4018d57f1469 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.194 183407 DEBUG nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.196 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.196 183407 INFO nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Creating image(s)
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.197 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "/var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.198 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "/var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.199 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "/var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.200 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.206 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.209 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.282 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.283 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.284 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.285 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.288 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.289 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.353 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.354 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.361 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "refresh_cache-c52ee407-1afb-4ae3-ae7f-792592e6badf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.388 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.389 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.389 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.428 183407 WARNING neutronclient.v2_0.client [req-d91347e1-ede2-40b5-9981-01dd3bd340fa req-570ea7b6-03c4-4939-98ed-fa54df7608c4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.472 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.473 183407 DEBUG nova.virt.disk.api [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Checking if we can resize image /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.474 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.533 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.535 183407 DEBUG nova.virt.disk.api [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Cannot resize image /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.536 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.536 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Ensure instance console log exists: /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.537 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.538 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.538 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:05 compute-1 nova_compute[183403]: 2026-01-26 15:18:05.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:18:05 compute-1 podman[192725]: time="2026-01-26T15:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:18:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:18:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2190 "" "Go-http-client/1.1"
Jan 26 15:18:06 compute-1 nova_compute[183403]: 2026-01-26 15:18:06.363 183407 DEBUG nova.network.neutron [req-d91347e1-ede2-40b5-9981-01dd3bd340fa req-570ea7b6-03c4-4939-98ed-fa54df7608c4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:18:06 compute-1 nova_compute[183403]: 2026-01-26 15:18:06.496 183407 DEBUG nova.network.neutron [req-d91347e1-ede2-40b5-9981-01dd3bd340fa req-570ea7b6-03c4-4939-98ed-fa54df7608c4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:18:07 compute-1 nova_compute[183403]: 2026-01-26 15:18:07.003 183407 DEBUG oslo_concurrency.lockutils [req-d91347e1-ede2-40b5-9981-01dd3bd340fa req-570ea7b6-03c4-4939-98ed-fa54df7608c4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-c52ee407-1afb-4ae3-ae7f-792592e6badf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:18:07 compute-1 nova_compute[183403]: 2026-01-26 15:18:07.004 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquired lock "refresh_cache-c52ee407-1afb-4ae3-ae7f-792592e6badf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:18:07 compute-1 nova_compute[183403]: 2026-01-26 15:18:07.004 183407 DEBUG nova.network.neutron [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:18:07 compute-1 nova_compute[183403]: 2026-01-26 15:18:07.643 183407 DEBUG nova.network.neutron [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:18:07 compute-1 nova_compute[183403]: 2026-01-26 15:18:07.833 183407 WARNING neutronclient.v2_0.client [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.378 183407 DEBUG nova.network.neutron [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Updating instance_info_cache with network_info: [{"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.871 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.886 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Releasing lock "refresh_cache-c52ee407-1afb-4ae3-ae7f-792592e6badf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.887 183407 DEBUG nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Instance network_info: |[{"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.891 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Start _get_guest_xml network_info=[{"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.898 183407 WARNING nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.900 183407 DEBUG nova.virt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1462342914', uuid='c52ee407-1afb-4ae3-ae7f-792592e6badf'), owner=OwnerMeta(userid='eabb3af6e41e4d9e883fc43bd03679db', username='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin', projectid='3ed11f66f0de4f6191def09f65c67624', projectname='tempest-TestExecuteHostMaintenanceStrategy-1844876463'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769440688.899988) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.905 183407 DEBUG nova.virt.libvirt.host [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.906 183407 DEBUG nova.virt.libvirt.host [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.909 183407 DEBUG nova.virt.libvirt.host [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.910 183407 DEBUG nova.virt.libvirt.host [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.912 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.912 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.913 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.913 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.914 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.914 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.915 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.915 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.916 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.916 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.916 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.917 183407 DEBUG nova.virt.hardware [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.923 183407 DEBUG nova.virt.libvirt.vif [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:17:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1462342914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1462342914',id=13,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-eo9xbxqw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:18:04Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=c52ee407-1afb-4ae3-ae7f-792592e6badf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.924 183407 DEBUG nova.network.os_vif_util [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converting VIF {"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.925 183407 DEBUG nova.network.os_vif_util [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:ca:e0,bridge_name='br-int',has_traffic_filtering=True,id=74d00ce8-8619-4c5a-a2f4-4018d57f1469,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74d00ce8-86') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:18:08 compute-1 nova_compute[183403]: 2026-01-26 15:18:08.926 183407 DEBUG nova.objects.instance [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lazy-loading 'pci_devices' on Instance uuid c52ee407-1afb-4ae3-ae7f-792592e6badf obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.088 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.088 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.088 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.088 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.264 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.265 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.293 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.294 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5838MB free_disk=73.14859390258789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.294 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.295 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.436 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <uuid>c52ee407-1afb-4ae3-ae7f-792592e6badf</uuid>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <name>instance-0000000d</name>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1462342914</nova:name>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:18:08</nova:creationTime>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:18:09 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:18:09 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:user uuid="eabb3af6e41e4d9e883fc43bd03679db">tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin</nova:user>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:project uuid="3ed11f66f0de4f6191def09f65c67624">tempest-TestExecuteHostMaintenanceStrategy-1844876463</nova:project>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         <nova:port uuid="74d00ce8-8619-4c5a-a2f4-4018d57f1469">
Jan 26 15:18:09 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <system>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <entry name="serial">c52ee407-1afb-4ae3-ae7f-792592e6badf</entry>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <entry name="uuid">c52ee407-1afb-4ae3-ae7f-792592e6badf</entry>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     </system>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <os>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   </os>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <features>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   </features>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk.config"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:f6:ca:e0"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <target dev="tap74d00ce8-86"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/console.log" append="off"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <video>
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     </video>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:18:09 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:18:09 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:18:09 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:18:09 compute-1 nova_compute[183403]: </domain>
Jan 26 15:18:09 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.438 183407 DEBUG nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Preparing to wait for external event network-vif-plugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.438 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.439 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.439 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.440 183407 DEBUG nova.virt.libvirt.vif [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:17:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1462342914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1462342914',id=13,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-eo9xbxqw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name=
'tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:18:04Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=c52ee407-1afb-4ae3-ae7f-792592e6badf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.441 183407 DEBUG nova.network.os_vif_util [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converting VIF {"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.442 183407 DEBUG nova.network.os_vif_util [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:ca:e0,bridge_name='br-int',has_traffic_filtering=True,id=74d00ce8-8619-4c5a-a2f4-4018d57f1469,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74d00ce8-86') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.442 183407 DEBUG os_vif [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:ca:e0,bridge_name='br-int',has_traffic_filtering=True,id=74d00ce8-8619-4c5a-a2f4-4018d57f1469,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74d00ce8-86') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.443 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.444 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.444 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.445 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.446 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '202b097d-89c1-514a-a39f-6d91c1a09ab5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.447 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.449 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.450 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.454 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.454 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74d00ce8-86, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.455 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap74d00ce8-86, col_values=(('qos', UUID('254ebe47-0260-4575-83dc-8097159bd14e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.455 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap74d00ce8-86, col_values=(('external_ids', {'iface-id': '74d00ce8-8619-4c5a-a2f4-4018d57f1469', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:ca:e0', 'vm-uuid': 'c52ee407-1afb-4ae3-ae7f-792592e6badf'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.457 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:09 compute-1 NetworkManager[55716]: <info>  [1769440689.4581] manager: (tap74d00ce8-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.459 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.462 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.462 183407 INFO os_vif [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:ca:e0,bridge_name='br-int',has_traffic_filtering=True,id=74d00ce8-8619-4c5a-a2f4-4018d57f1469,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74d00ce8-86')
Jan 26 15:18:09 compute-1 nova_compute[183403]: 2026-01-26 15:18:09.821 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:10 compute-1 nova_compute[183403]: 2026-01-26 15:18:10.344 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance c52ee407-1afb-4ae3-ae7f-792592e6badf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:18:10 compute-1 nova_compute[183403]: 2026-01-26 15:18:10.344 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:18:10 compute-1 nova_compute[183403]: 2026-01-26 15:18:10.345 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:18:09 up  1:13,  0 user,  load average: 0.08, 0.15, 0.28\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_3ed11f66f0de4f6191def09f65c67624': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:18:10 compute-1 nova_compute[183403]: 2026-01-26 15:18:10.443 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:18:10 compute-1 nova_compute[183403]: 2026-01-26 15:18:10.956 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.014 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.014 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.015 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] No VIF found with MAC fa:16:3e:f6:ca:e0, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.016 183407 INFO nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Using config drive
Jan 26 15:18:11 compute-1 sshd-session[208327]: Invalid user vyos from 185.246.128.170 port 11953
Jan 26 15:18:11 compute-1 podman[208338]: 2026-01-26 15:18:11.428767652 +0000 UTC m=+0.074945130 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 15:18:11 compute-1 podman[208337]: 2026-01-26 15:18:11.465865032 +0000 UTC m=+0.126314258 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, 
tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.467 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.468 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.172s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.530 183407 WARNING neutronclient.v2_0.client [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:11 compute-1 sshd-session[208327]: Connection closed by invalid user vyos 185.246.128.170 port 11953 [preauth]
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.754 183407 INFO nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Creating config drive at /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk.config
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.764 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpd4x6gvnc execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.893 183407 DEBUG oslo_concurrency.processutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpd4x6gvnc" returned: 0 in 0.129s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:11 compute-1 kernel: tap74d00ce8-86: entered promiscuous mode
Jan 26 15:18:11 compute-1 NetworkManager[55716]: <info>  [1769440691.9724] manager: (tap74d00ce8-86): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Jan 26 15:18:11 compute-1 ovn_controller[95641]: 2026-01-26T15:18:11Z|00106|binding|INFO|Claiming lport 74d00ce8-8619-4c5a-a2f4-4018d57f1469 for this chassis.
Jan 26 15:18:11 compute-1 ovn_controller[95641]: 2026-01-26T15:18:11Z|00107|binding|INFO|74d00ce8-8619-4c5a-a2f4-4018d57f1469: Claiming fa:16:3e:f6:ca:e0 10.100.0.11
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.972 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.975 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:11 compute-1 nova_compute[183403]: 2026-01-26 15:18:11.984 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:11 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:11.997 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:ca:e0 10.100.0.11'], port_security=['fa:16:3e:f6:ca:e0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c52ee407-1afb-4ae3-ae7f-792592e6badf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e54c8a16-0c91-4c9d-aa33-a10f9396fada', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ffea2f62-0986-47bd-a80c-89f1c9decf3f, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=74d00ce8-8619-4c5a-a2f4-4018d57f1469) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:18:11 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:11.998 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 74d00ce8-8619-4c5a-a2f4-4018d57f1469 in datapath 0df777d6-b389-44bc-b166-8208ab926234 bound to our chassis
Jan 26 15:18:11 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:11.999 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:18:12 compute-1 systemd-udevd[208396]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.012 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff07914-8393-4f92-8881-86066124cb02]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.013 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0df777d6-b1 in ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.015 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0df777d6-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.015 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2773eebd-1e1e-41a5-a75f-df8b99ba883a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.015 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[90796c87-15f7-4db7-8d10-def5011a4afc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 systemd-machined[154697]: New machine qemu-9-instance-0000000d.
Jan 26 15:18:12 compute-1 NetworkManager[55716]: <info>  [1769440692.0244] device (tap74d00ce8-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:18:12 compute-1 NetworkManager[55716]: <info>  [1769440692.0250] device (tap74d00ce8-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.030 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[fcbef369-be19-467a-8703-e8636b8bb928]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.036 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:12 compute-1 ovn_controller[95641]: 2026-01-26T15:18:12Z|00108|binding|INFO|Setting lport 74d00ce8-8619-4c5a-a2f4-4018d57f1469 ovn-installed in OVS
Jan 26 15:18:12 compute-1 ovn_controller[95641]: 2026-01-26T15:18:12Z|00109|binding|INFO|Setting lport 74d00ce8-8619-4c5a-a2f4-4018d57f1469 up in Southbound
Jan 26 15:18:12 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-0000000d.
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.043 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.048 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f53565ca-aa95-455a-96ae-aa8140e39fd1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.078 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[9c53ea93-dd5f-418e-a556-b861a5f9afa8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.084 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[770b3dfd-a3f3-4476-95c9-0a1e17000be3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 systemd-udevd[208401]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:18:12 compute-1 NetworkManager[55716]: <info>  [1769440692.0851] manager: (tap0df777d6-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.125 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd9ca62-7a5c-4331-ad6a-07539740eec6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.127 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[174b5829-2608-41c0-9944-3c2ebef8a9ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 NetworkManager[55716]: <info>  [1769440692.1527] device (tap0df777d6-b0): carrier: link connected
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.159 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[1be8c418-c364-4be5-a7c4-e91076f9e1ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.173 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2e4ba7-52b7-42cf-9738-10c4c5141642]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0df777d6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:dc:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441475, 'reachable_time': 40452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208430, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.189 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4b945f2f-84ff-400e-a6c7-52beb4a237d4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:dcc5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441475, 'tstamp': 441475}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208432, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.208 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[594a2418-1dc1-4b1e-9e0c-288c86bcc63a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0df777d6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:dc:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441475, 'reachable_time': 40452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208433, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.238 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f7e3ec-9140-4ce6-8ce2-ad50de455e3b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.249 183407 DEBUG nova.compute.manager [req-3a6b5aa6-a6b4-4625-915a-70a14dd946dd req-8d261616-8f92-460d-9899-6d9c3fd479cb 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Received event network-vif-plugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.250 183407 DEBUG oslo_concurrency.lockutils [req-3a6b5aa6-a6b4-4625-915a-70a14dd946dd req-8d261616-8f92-460d-9899-6d9c3fd479cb 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.250 183407 DEBUG oslo_concurrency.lockutils [req-3a6b5aa6-a6b4-4625-915a-70a14dd946dd req-8d261616-8f92-460d-9899-6d9c3fd479cb 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.251 183407 DEBUG oslo_concurrency.lockutils [req-3a6b5aa6-a6b4-4625-915a-70a14dd946dd req-8d261616-8f92-460d-9899-6d9c3fd479cb 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.251 183407 DEBUG nova.compute.manager [req-3a6b5aa6-a6b4-4625-915a-70a14dd946dd req-8d261616-8f92-460d-9899-6d9c3fd479cb 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Processing event network-vif-plugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.270 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.306 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4e858491-5a45-4768-b03f-44ee75406f5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.306 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0df777d6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.307 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.307 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0df777d6-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.310 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:12 compute-1 NetworkManager[55716]: <info>  [1769440692.3110] manager: (tap0df777d6-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 26 15:18:12 compute-1 kernel: tap0df777d6-b0: entered promiscuous mode
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.313 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0df777d6-b0, col_values=(('external_ids', {'iface-id': '33751b84-abc7-465c-a58b-58ca2b0cbc0a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:12 compute-1 ovn_controller[95641]: 2026-01-26T15:18:12Z|00110|binding|INFO|Releasing lport 33751b84-abc7-465c-a58b-58ca2b0cbc0a from this chassis (sb_readonly=0)
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.338 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.342 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9cbdb678-3f74-4877-af8f-dc13a0b83db2]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.343 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.344 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.344 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 0df777d6-b389-44bc-b166-8208ab926234 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.344 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.345 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[23f62b9d-157b-40d2-a608-cbe20fa7c829]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.345 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.346 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[cc427607-0d51-45f3-82c5-aa46dc40b919]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.346 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID 0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.348 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'env', 'PROCESS_TAG=haproxy-0df777d6-b389-44bc-b166-8208ab926234', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0df777d6-b389-44bc-b166-8208ab926234.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.380 183407 DEBUG nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.385 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.390 183407 INFO nova.virt.libvirt.driver [-] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Instance spawned successfully.
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.391 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:18:12 compute-1 podman[208473]: 2026-01-26 15:18:12.760937922 +0000 UTC m=+0.071859543 container create acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 15:18:12 compute-1 systemd[1]: Started libpod-conmon-acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08.scope.
Jan 26 15:18:12 compute-1 podman[208473]: 2026-01-26 15:18:12.730941839 +0000 UTC m=+0.041863470 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:18:12 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:18:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54c90fd9e4f14f86a1e7485a0b062957c677212fb9460da19d06ec1155e35802/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:18:12 compute-1 podman[208473]: 2026-01-26 15:18:12.879202997 +0000 UTC m=+0.190124658 container init acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 15:18:12 compute-1 podman[208473]: 2026-01-26 15:18:12.886013087 +0000 UTC m=+0.196934708 container start acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.906 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.906 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.907 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.907 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.908 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:18:12 compute-1 nova_compute[183403]: 2026-01-26 15:18:12.908 183407 DEBUG nova.virt.libvirt.driver [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:18:12 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[208488]: [NOTICE]   (208492) : New worker (208494) forked
Jan 26 15:18:12 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[208488]: [NOTICE]   (208492) : Loading success.
Jan 26 15:18:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:12.994 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:18:13 compute-1 nova_compute[183403]: 2026-01-26 15:18:13.422 183407 INFO nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Took 8.23 seconds to spawn the instance on the hypervisor.
Jan 26 15:18:13 compute-1 nova_compute[183403]: 2026-01-26 15:18:13.423 183407 DEBUG nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:18:13 compute-1 nova_compute[183403]: 2026-01-26 15:18:13.468 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:18:13 compute-1 nova_compute[183403]: 2026-01-26 15:18:13.957 183407 INFO nova.compute.manager [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Took 13.59 seconds to build instance.
Jan 26 15:18:13 compute-1 nova_compute[183403]: 2026-01-26 15:18:13.975 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:18:13 compute-1 nova_compute[183403]: 2026-01-26 15:18:13.976 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:18:13 compute-1 nova_compute[183403]: 2026-01-26 15:18:13.976 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:18:13 compute-1 nova_compute[183403]: 2026-01-26 15:18:13.976 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:18:14 compute-1 nova_compute[183403]: 2026-01-26 15:18:14.322 183407 DEBUG nova.compute.manager [req-ac6add86-cdb5-4a94-94b7-624241376809 req-bda163a9-bc8d-4a2c-b023-9e9a93615a77 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Received event network-vif-plugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:18:14 compute-1 nova_compute[183403]: 2026-01-26 15:18:14.323 183407 DEBUG oslo_concurrency.lockutils [req-ac6add86-cdb5-4a94-94b7-624241376809 req-bda163a9-bc8d-4a2c-b023-9e9a93615a77 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:14 compute-1 nova_compute[183403]: 2026-01-26 15:18:14.323 183407 DEBUG oslo_concurrency.lockutils [req-ac6add86-cdb5-4a94-94b7-624241376809 req-bda163a9-bc8d-4a2c-b023-9e9a93615a77 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:14 compute-1 nova_compute[183403]: 2026-01-26 15:18:14.323 183407 DEBUG oslo_concurrency.lockutils [req-ac6add86-cdb5-4a94-94b7-624241376809 req-bda163a9-bc8d-4a2c-b023-9e9a93615a77 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:14 compute-1 nova_compute[183403]: 2026-01-26 15:18:14.324 183407 DEBUG nova.compute.manager [req-ac6add86-cdb5-4a94-94b7-624241376809 req-bda163a9-bc8d-4a2c-b023-9e9a93615a77 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] No waiting events found dispatching network-vif-plugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:18:14 compute-1 nova_compute[183403]: 2026-01-26 15:18:14.324 183407 WARNING nova.compute.manager [req-ac6add86-cdb5-4a94-94b7-624241376809 req-bda163a9-bc8d-4a2c-b023-9e9a93615a77 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Received unexpected event network-vif-plugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 for instance with vm_state active and task_state None.
Jan 26 15:18:14 compute-1 nova_compute[183403]: 2026-01-26 15:18:14.458 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:14 compute-1 nova_compute[183403]: 2026-01-26 15:18:14.462 183407 DEBUG oslo_concurrency.lockutils [None req-0f6ae01a-9be5-46be-921e-33bc4032c83d eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:14 compute-1 nova_compute[183403]: 2026-01-26 15:18:14.825 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:14.995 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:15 compute-1 nova_compute[183403]: 2026-01-26 15:18:15.080 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:18:17 compute-1 nova_compute[183403]: 2026-01-26 15:18:17.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:18:19 compute-1 openstack_network_exporter[195610]: ERROR   15:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:18:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:18:19 compute-1 openstack_network_exporter[195610]: ERROR   15:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:18:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:18:19 compute-1 nova_compute[183403]: 2026-01-26 15:18:19.461 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:19 compute-1 nova_compute[183403]: 2026-01-26 15:18:19.827 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:24 compute-1 nova_compute[183403]: 2026-01-26 15:18:24.466 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:24 compute-1 nova_compute[183403]: 2026-01-26 15:18:24.829 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:25 compute-1 ovn_controller[95641]: 2026-01-26T15:18:25Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f6:ca:e0 10.100.0.11
Jan 26 15:18:25 compute-1 ovn_controller[95641]: 2026-01-26T15:18:25Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f6:ca:e0 10.100.0.11
Jan 26 15:18:28 compute-1 nova_compute[183403]: 2026-01-26 15:18:28.867 183407 DEBUG nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Creating tmpfile /var/lib/nova/instances/tmp2zdv3426 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:18:28 compute-1 nova_compute[183403]: 2026-01-26 15:18:28.869 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:28 compute-1 nova_compute[183403]: 2026-01-26 15:18:28.881 183407 DEBUG nova.compute.manager [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2zdv3426',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:18:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:29.048 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:29.049 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:29.050 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:29 compute-1 nova_compute[183403]: 2026-01-26 15:18:29.471 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:29 compute-1 nova_compute[183403]: 2026-01-26 15:18:29.882 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:30 compute-1 nova_compute[183403]: 2026-01-26 15:18:30.940 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:32 compute-1 podman[208531]: 2026-01-26 15:18:32.912373084 +0000 UTC m=+0.067852948 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:18:32 compute-1 podman[208532]: 2026-01-26 15:18:32.912419265 +0000 UTC m=+0.062605563 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, name=ubi9-minimal, version=9.6, distribution-scope=public, vendor=Red Hat, Inc.)
Jan 26 15:18:34 compute-1 nova_compute[183403]: 2026-01-26 15:18:34.477 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:34 compute-1 nova_compute[183403]: 2026-01-26 15:18:34.885 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:35 compute-1 nova_compute[183403]: 2026-01-26 15:18:35.469 183407 DEBUG nova.compute.manager [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2zdv3426',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e369b01-79a6-4f8a-bf56-148d715aaea5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:18:35 compute-1 podman[192725]: time="2026-01-26T15:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:18:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:18:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2652 "" "Go-http-client/1.1"
Jan 26 15:18:36 compute-1 nova_compute[183403]: 2026-01-26 15:18:36.577 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-8e369b01-79a6-4f8a-bf56-148d715aaea5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:18:36 compute-1 nova_compute[183403]: 2026-01-26 15:18:36.578 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-8e369b01-79a6-4f8a-bf56-148d715aaea5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:18:36 compute-1 nova_compute[183403]: 2026-01-26 15:18:36.578 183407 DEBUG nova.network.neutron [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:18:37 compute-1 nova_compute[183403]: 2026-01-26 15:18:37.085 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:37 compute-1 nova_compute[183403]: 2026-01-26 15:18:37.669 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:37 compute-1 nova_compute[183403]: 2026-01-26 15:18:37.884 183407 DEBUG nova.network.neutron [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Updating instance_info_cache with network_info: [{"id": "3504cf38-e909-4755-aa5c-6de44f9944b0", "address": "fa:16:3e:d3:34:ef", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3504cf38-e9", "ovs_interfaceid": "3504cf38-e909-4755-aa5c-6de44f9944b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:18:38 compute-1 nova_compute[183403]: 2026-01-26 15:18:38.657 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-8e369b01-79a6-4f8a-bf56-148d715aaea5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:18:38 compute-1 nova_compute[183403]: 2026-01-26 15:18:38.672 183407 DEBUG nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2zdv3426',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e369b01-79a6-4f8a-bf56-148d715aaea5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:18:38 compute-1 nova_compute[183403]: 2026-01-26 15:18:38.673 183407 DEBUG nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Creating instance directory: /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:18:38 compute-1 nova_compute[183403]: 2026-01-26 15:18:38.673 183407 DEBUG nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Creating disk.info with the contents: {'/var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk': 'qcow2', '/var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:18:38 compute-1 nova_compute[183403]: 2026-01-26 15:18:38.674 183407 DEBUG nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:18:38 compute-1 nova_compute[183403]: 2026-01-26 15:18:38.674 183407 DEBUG nova.objects.instance [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8e369b01-79a6-4f8a-bf56-148d715aaea5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.179 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.182 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.185 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.271 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.272 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.273 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.274 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.278 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.279 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.359 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.360 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.399 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.401 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.401 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.452 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.453 183407 DEBUG nova.virt.disk.api [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.454 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.481 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.504 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.505 183407 DEBUG nova.virt.disk.api [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.506 183407 DEBUG nova.objects.instance [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 8e369b01-79a6-4f8a-bf56-148d715aaea5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:18:39 compute-1 nova_compute[183403]: 2026-01-26 15:18:39.888 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.016 183407 DEBUG nova.objects.base [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<8e369b01-79a6-4f8a-bf56-148d715aaea5> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.017 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.055 183407 DEBUG oslo_concurrency.processutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk.config 497664" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.057 183407 DEBUG nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.059 183407 DEBUG nova.virt.libvirt.vif [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:17:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-794224949',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-794224949',id=12,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:17:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-dc35661j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:17:55Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=8e369b01-79a6-4f8a-bf56-148d715aaea5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3504cf38-e909-4755-aa5c-6de44f9944b0", "address": "fa:16:3e:d3:34:ef", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3504cf38-e9", "ovs_interfaceid": "3504cf38-e909-4755-aa5c-6de44f9944b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.060 183407 DEBUG nova.network.os_vif_util [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "3504cf38-e909-4755-aa5c-6de44f9944b0", "address": "fa:16:3e:d3:34:ef", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3504cf38-e9", "ovs_interfaceid": "3504cf38-e909-4755-aa5c-6de44f9944b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.061 183407 DEBUG nova.network.os_vif_util [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:34:ef,bridge_name='br-int',has_traffic_filtering=True,id=3504cf38-e909-4755-aa5c-6de44f9944b0,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3504cf38-e9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.062 183407 DEBUG os_vif [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:34:ef,bridge_name='br-int',has_traffic_filtering=True,id=3504cf38-e909-4755-aa5c-6de44f9944b0,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3504cf38-e9') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.063 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.064 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.065 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.066 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.067 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '09c57756-75b7-5faf-8aae-47f927d0c94c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.068 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.071 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.075 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.075 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3504cf38-e9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.076 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3504cf38-e9, col_values=(('qos', UUID('9e1ae6db-d9d0-4454-9129-4ff5130fc60f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.076 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3504cf38-e9, col_values=(('external_ids', {'iface-id': '3504cf38-e909-4755-aa5c-6de44f9944b0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:34:ef', 'vm-uuid': '8e369b01-79a6-4f8a-bf56-148d715aaea5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.077 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:40 compute-1 NetworkManager[55716]: <info>  [1769440720.0791] manager: (tap3504cf38-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.079 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.086 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.087 183407 INFO os_vif [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:34:ef,bridge_name='br-int',has_traffic_filtering=True,id=3504cf38-e909-4755-aa5c-6de44f9944b0,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3504cf38-e9')
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.087 183407 DEBUG nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.088 183407 DEBUG nova.compute.manager [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2zdv3426',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e369b01-79a6-4f8a-bf56-148d715aaea5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.088 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:40 compute-1 nova_compute[183403]: 2026-01-26 15:18:40.368 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:41 compute-1 nova_compute[183403]: 2026-01-26 15:18:41.409 183407 DEBUG nova.network.neutron [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Port 3504cf38-e909-4755-aa5c-6de44f9944b0 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:18:41 compute-1 nova_compute[183403]: 2026-01-26 15:18:41.426 183407 DEBUG nova.compute.manager [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2zdv3426',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e369b01-79a6-4f8a-bf56-148d715aaea5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:18:41 compute-1 podman[208597]: 2026-01-26 15:18:41.878321995 +0000 UTC m=+0.053443210 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 15:18:41 compute-1 podman[208596]: 2026-01-26 15:18:41.944480665 +0000 UTC m=+0.118494469 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, 
org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 26 15:18:42 compute-1 ovn_controller[95641]: 2026-01-26T15:18:42Z|00111|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 15:18:44 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 15:18:44 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 15:18:44 compute-1 kernel: tap3504cf38-e9: entered promiscuous mode
Jan 26 15:18:44 compute-1 NetworkManager[55716]: <info>  [1769440724.7227] manager: (tap3504cf38-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 26 15:18:44 compute-1 nova_compute[183403]: 2026-01-26 15:18:44.724 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:44 compute-1 ovn_controller[95641]: 2026-01-26T15:18:44Z|00112|binding|INFO|Claiming lport 3504cf38-e909-4755-aa5c-6de44f9944b0 for this additional chassis.
Jan 26 15:18:44 compute-1 ovn_controller[95641]: 2026-01-26T15:18:44Z|00113|binding|INFO|3504cf38-e909-4755-aa5c-6de44f9944b0: Claiming fa:16:3e:d3:34:ef 10.100.0.4
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.734 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:34:ef 10.100.0.4'], port_security=['fa:16:3e:d3:34:ef 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8e369b01-79a6-4f8a-bf56-148d715aaea5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'e54c8a16-0c91-4c9d-aa33-a10f9396fada', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ffea2f62-0986-47bd-a80c-89f1c9decf3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3504cf38-e909-4755-aa5c-6de44f9944b0) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.735 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 3504cf38-e909-4755-aa5c-6de44f9944b0 in datapath 0df777d6-b389-44bc-b166-8208ab926234 unbound from our chassis
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.737 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:18:44 compute-1 ovn_controller[95641]: 2026-01-26T15:18:44Z|00114|binding|INFO|Setting lport 3504cf38-e909-4755-aa5c-6de44f9944b0 ovn-installed in OVS
Jan 26 15:18:44 compute-1 nova_compute[183403]: 2026-01-26 15:18:44.744 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:44 compute-1 nova_compute[183403]: 2026-01-26 15:18:44.746 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.755 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c055ece8-e47f-4834-9c69-235e93b71a97]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:44 compute-1 systemd-udevd[208676]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:18:44 compute-1 systemd-machined[154697]: New machine qemu-10-instance-0000000c.
Jan 26 15:18:44 compute-1 NetworkManager[55716]: <info>  [1769440724.7763] device (tap3504cf38-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:18:44 compute-1 NetworkManager[55716]: <info>  [1769440724.7772] device (tap3504cf38-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:18:44 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-0000000c.
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.785 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[37b9d722-a52d-49c5-9be8-892a0c5180a7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.787 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[2b01aea9-2a88-4917-940e-79a83160b4b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.813 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[139b39df-2658-4adb-92aa-c112a779217b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.829 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2243290b-b56d-4ae7-bc7f-3596faf36140]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0df777d6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:dc:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441475, 'reachable_time': 40452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208682, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.843 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[30008629-2c3e-4de3-a6d8-c3dff67f7cac]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0df777d6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441486, 'tstamp': 441486}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208688, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0df777d6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441490, 'tstamp': 441490}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208688, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.844 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0df777d6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:44 compute-1 nova_compute[183403]: 2026-01-26 15:18:44.845 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:44 compute-1 nova_compute[183403]: 2026-01-26 15:18:44.847 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.847 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0df777d6-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.847 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.848 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0df777d6-b0, col_values=(('external_ids', {'iface-id': '33751b84-abc7-465c-a58b-58ca2b0cbc0a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.848 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:18:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:18:44.850 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[47e1406c-4065-4262-bd14-b67a41ea0a72]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0df777d6-b389-44bc-b166-8208ab926234\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0df777d6-b389-44bc-b166-8208ab926234\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:18:44 compute-1 nova_compute[183403]: 2026-01-26 15:18:44.890 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:45 compute-1 nova_compute[183403]: 2026-01-26 15:18:45.077 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:47 compute-1 ovn_controller[95641]: 2026-01-26T15:18:47Z|00115|binding|INFO|Claiming lport 3504cf38-e909-4755-aa5c-6de44f9944b0 for this chassis.
Jan 26 15:18:47 compute-1 ovn_controller[95641]: 2026-01-26T15:18:47Z|00116|binding|INFO|3504cf38-e909-4755-aa5c-6de44f9944b0: Claiming fa:16:3e:d3:34:ef 10.100.0.4
Jan 26 15:18:47 compute-1 ovn_controller[95641]: 2026-01-26T15:18:47Z|00117|binding|INFO|Setting lport 3504cf38-e909-4755-aa5c-6de44f9944b0 up in Southbound
Jan 26 15:18:48 compute-1 nova_compute[183403]: 2026-01-26 15:18:48.876 183407 INFO nova.compute.manager [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Post operation of migration started
Jan 26 15:18:48 compute-1 nova_compute[183403]: 2026-01-26 15:18:48.877 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:48 compute-1 nova_compute[183403]: 2026-01-26 15:18:48.990 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:48 compute-1 nova_compute[183403]: 2026-01-26 15:18:48.991 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:49 compute-1 nova_compute[183403]: 2026-01-26 15:18:49.081 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-8e369b01-79a6-4f8a-bf56-148d715aaea5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:18:49 compute-1 nova_compute[183403]: 2026-01-26 15:18:49.082 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-8e369b01-79a6-4f8a-bf56-148d715aaea5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:18:49 compute-1 nova_compute[183403]: 2026-01-26 15:18:49.082 183407 DEBUG nova.network.neutron [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:18:49 compute-1 openstack_network_exporter[195610]: ERROR   15:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:18:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:18:49 compute-1 openstack_network_exporter[195610]: ERROR   15:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:18:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:18:49 compute-1 nova_compute[183403]: 2026-01-26 15:18:49.592 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:49 compute-1 nova_compute[183403]: 2026-01-26 15:18:49.892 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:50 compute-1 nova_compute[183403]: 2026-01-26 15:18:50.117 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:51 compute-1 nova_compute[183403]: 2026-01-26 15:18:51.380 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:51 compute-1 nova_compute[183403]: 2026-01-26 15:18:51.575 183407 DEBUG nova.network.neutron [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Updating instance_info_cache with network_info: [{"id": "3504cf38-e909-4755-aa5c-6de44f9944b0", "address": "fa:16:3e:d3:34:ef", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3504cf38-e9", "ovs_interfaceid": "3504cf38-e909-4755-aa5c-6de44f9944b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:18:52 compute-1 nova_compute[183403]: 2026-01-26 15:18:52.082 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-8e369b01-79a6-4f8a-bf56-148d715aaea5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:18:52 compute-1 nova_compute[183403]: 2026-01-26 15:18:52.614 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:18:52 compute-1 nova_compute[183403]: 2026-01-26 15:18:52.615 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:18:52 compute-1 nova_compute[183403]: 2026-01-26 15:18:52.615 183407 DEBUG oslo_concurrency.lockutils [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:18:52 compute-1 nova_compute[183403]: 2026-01-26 15:18:52.619 183407 INFO nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:18:52 compute-1 virtqemud[183290]: Domain id=10 name='instance-0000000c' uuid=8e369b01-79a6-4f8a-bf56-148d715aaea5 is tainted: custom-monitor
Jan 26 15:18:53 compute-1 nova_compute[183403]: 2026-01-26 15:18:53.628 183407 INFO nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:18:54 compute-1 nova_compute[183403]: 2026-01-26 15:18:54.636 183407 INFO nova.virt.libvirt.driver [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:18:54 compute-1 nova_compute[183403]: 2026-01-26 15:18:54.644 183407 DEBUG nova.compute.manager [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:18:54 compute-1 nova_compute[183403]: 2026-01-26 15:18:54.895 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:55 compute-1 nova_compute[183403]: 2026-01-26 15:18:55.118 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:18:55 compute-1 nova_compute[183403]: 2026-01-26 15:18:55.157 183407 DEBUG nova.objects.instance [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:18:56 compute-1 nova_compute[183403]: 2026-01-26 15:18:56.177 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:56 compute-1 nova_compute[183403]: 2026-01-26 15:18:56.376 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:56 compute-1 nova_compute[183403]: 2026-01-26 15:18:56.377 183407 WARNING neutronclient.v2_0.client [None req-a787da4d-646d-4741-a3d6-c02eb8806d8c a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:18:59 compute-1 nova_compute[183403]: 2026-01-26 15:18:59.896 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:00 compute-1 nova_compute[183403]: 2026-01-26 15:19:00.121 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:03 compute-1 podman[208708]: 2026-01-26 15:19:03.89667697 +0000 UTC m=+0.074073929 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:19:03 compute-1 podman[208709]: 2026-01-26 15:19:03.900531537 +0000 UTC m=+0.074422309 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.expose-services=)
Jan 26 15:19:04 compute-1 nova_compute[183403]: 2026-01-26 15:19:04.897 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:05 compute-1 nova_compute[183403]: 2026-01-26 15:19:05.179 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:05 compute-1 podman[192725]: time="2026-01-26T15:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:19:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:19:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2648 "" "Go-http-client/1.1"
Jan 26 15:19:07 compute-1 nova_compute[183403]: 2026-01-26 15:19:07.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.075 183407 DEBUG oslo_concurrency.lockutils [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "c52ee407-1afb-4ae3-ae7f-792592e6badf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.076 183407 DEBUG oslo_concurrency.lockutils [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.076 183407 DEBUG oslo_concurrency.lockutils [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.076 183407 DEBUG oslo_concurrency.lockutils [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.077 183407 DEBUG oslo_concurrency.lockutils [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.115 183407 INFO nova.compute.manager [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Terminating instance
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.632 183407 DEBUG nova.compute.manager [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:19:08 compute-1 kernel: tap74d00ce8-86 (unregistering): left promiscuous mode
Jan 26 15:19:08 compute-1 NetworkManager[55716]: <info>  [1769440748.6647] device (tap74d00ce8-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:19:08 compute-1 ovn_controller[95641]: 2026-01-26T15:19:08Z|00118|binding|INFO|Releasing lport 74d00ce8-8619-4c5a-a2f4-4018d57f1469 from this chassis (sb_readonly=0)
Jan 26 15:19:08 compute-1 ovn_controller[95641]: 2026-01-26T15:19:08Z|00119|binding|INFO|Setting lport 74d00ce8-8619-4c5a-a2f4-4018d57f1469 down in Southbound
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.678 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:08 compute-1 ovn_controller[95641]: 2026-01-26T15:19:08Z|00120|binding|INFO|Removing iface tap74d00ce8-86 ovn-installed in OVS
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.680 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.684 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:ca:e0 10.100.0.11'], port_security=['fa:16:3e:f6:ca:e0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c52ee407-1afb-4ae3-ae7f-792592e6badf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e54c8a16-0c91-4c9d-aa33-a10f9396fada', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ffea2f62-0986-47bd-a80c-89f1c9decf3f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=74d00ce8-8619-4c5a-a2f4-4018d57f1469) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.685 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 74d00ce8-8619-4c5a-a2f4-4018d57f1469 in datapath 0df777d6-b389-44bc-b166-8208ab926234 unbound from our chassis
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.686 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.693 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.709 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6805650c-8562-46c5-bf1c-230ad9c956db]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:08 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 26 15:19:08 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Consumed 14.958s CPU time.
Jan 26 15:19:08 compute-1 systemd-machined[154697]: Machine qemu-9-instance-0000000d terminated.
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.736 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[b00e6e84-44dc-43f0-83f2-b355870ff210]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.739 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[025bbf38-d86e-4d27-ac0c-2f777fbf6076]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.770 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[1cf34926-40f5-4e03-9aba-576fe6de4933]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.789 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[68ddf81c-5e58-4ddf-9a99-5702b3afc5c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0df777d6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:dc:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441475, 'reachable_time': 40452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208767, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.814 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc3fc2f-4d46-4d7d-bea9-365eadf75c6e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0df777d6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441486, 'tstamp': 441486}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208768, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0df777d6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441490, 'tstamp': 441490}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208768, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.816 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0df777d6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.818 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.825 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.825 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0df777d6-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.826 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.826 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0df777d6-b0, col_values=(('external_ids', {'iface-id': '33751b84-abc7-465c-a58b-58ca2b0cbc0a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.826 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:19:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:08.827 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7707f0eb-f8d6-4274-9b9f-06deb11aa353]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0df777d6-b389-44bc-b166-8208ab926234\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0df777d6-b389-44bc-b166-8208ab926234\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.857 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.861 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.903 183407 INFO nova.virt.libvirt.driver [-] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Instance destroyed successfully.
Jan 26 15:19:08 compute-1 nova_compute[183403]: 2026-01-26 15:19:08.904 183407 DEBUG nova.objects.instance [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lazy-loading 'resources' on Instance uuid c52ee407-1afb-4ae3-ae7f-792592e6badf obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.097 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.098 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.098 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.098 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.411 183407 DEBUG nova.virt.libvirt.vif [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:17:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1462342914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1462342914',id=13,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:18:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-eo9xbxqw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',i
mage_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:18:13Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=c52ee407-1afb-4ae3-ae7f-792592e6badf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.411 183407 DEBUG nova.network.os_vif_util [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converting VIF {"id": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "address": "fa:16:3e:f6:ca:e0", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74d00ce8-86", "ovs_interfaceid": "74d00ce8-8619-4c5a-a2f4-4018d57f1469", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.412 183407 DEBUG nova.network.os_vif_util [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:ca:e0,bridge_name='br-int',has_traffic_filtering=True,id=74d00ce8-8619-4c5a-a2f4-4018d57f1469,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74d00ce8-86') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.412 183407 DEBUG os_vif [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:ca:e0,bridge_name='br-int',has_traffic_filtering=True,id=74d00ce8-8619-4c5a-a2f4-4018d57f1469,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74d00ce8-86') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.413 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.414 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74d00ce8-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.415 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.416 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.417 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.418 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=254ebe47-0260-4575-83dc-8097159bd14e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.418 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.419 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.423 183407 INFO os_vif [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:ca:e0,bridge_name='br-int',has_traffic_filtering=True,id=74d00ce8-8619-4c5a-a2f4-4018d57f1469,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74d00ce8-86')
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.424 183407 INFO nova.virt.libvirt.driver [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Deleting instance files /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf_del
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.425 183407 INFO nova.virt.libvirt.driver [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Deletion of /var/lib/nova/instances/c52ee407-1afb-4ae3-ae7f-792592e6badf_del complete
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.523 183407 DEBUG nova.compute.manager [req-42798822-0756-482e-a836-18bbd2d01299 req-8ea94762-1121-4945-821e-5edabfb52ecd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Received event network-vif-unplugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.523 183407 DEBUG oslo_concurrency.lockutils [req-42798822-0756-482e-a836-18bbd2d01299 req-8ea94762-1121-4945-821e-5edabfb52ecd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.524 183407 DEBUG oslo_concurrency.lockutils [req-42798822-0756-482e-a836-18bbd2d01299 req-8ea94762-1121-4945-821e-5edabfb52ecd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.524 183407 DEBUG oslo_concurrency.lockutils [req-42798822-0756-482e-a836-18bbd2d01299 req-8ea94762-1121-4945-821e-5edabfb52ecd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.524 183407 DEBUG nova.compute.manager [req-42798822-0756-482e-a836-18bbd2d01299 req-8ea94762-1121-4945-821e-5edabfb52ecd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] No waiting events found dispatching network-vif-unplugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.525 183407 DEBUG nova.compute.manager [req-42798822-0756-482e-a836-18bbd2d01299 req-8ea94762-1121-4945-821e-5edabfb52ecd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Received event network-vif-unplugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.899 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.940 183407 INFO nova.compute.manager [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Took 1.31 seconds to destroy the instance on the hypervisor.
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.940 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.941 183407 DEBUG nova.compute.manager [-] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.941 183407 DEBUG nova.network.neutron [-] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:19:09 compute-1 nova_compute[183403]: 2026-01-26 15:19:09.941 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.141 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Error from libvirt while getting description of instance-0000000d: [Error Code 42] Domain not found: no domain with matching uuid 'c52ee407-1afb-4ae3-ae7f-792592e6badf' (instance-0000000d): libvirt.libvirtError: Domain not found: no domain with matching uuid 'c52ee407-1afb-4ae3-ae7f-792592e6badf' (instance-0000000d)
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.146 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.221 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.223 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.291 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.404 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.442 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.443 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.471 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.472 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5672MB free_disk=73.091064453125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.472 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.472 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.807 183407 DEBUG nova.compute.manager [req-815dbeb3-2654-45b4-9def-ea4c2b15763b req-992379e4-1eff-4908-ba75-9082f0ae069a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Received event network-vif-deleted-74d00ce8-8619-4c5a-a2f4-4018d57f1469 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.808 183407 INFO nova.compute.manager [req-815dbeb3-2654-45b4-9def-ea4c2b15763b req-992379e4-1eff-4908-ba75-9082f0ae069a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Neutron deleted interface 74d00ce8-8619-4c5a-a2f4-4018d57f1469; detaching it from the instance and deleting it from the info cache
Jan 26 15:19:10 compute-1 nova_compute[183403]: 2026-01-26 15:19:10.808 183407 DEBUG nova.network.neutron [req-815dbeb3-2654-45b4-9def-ea4c2b15763b req-992379e4-1eff-4908-ba75-9082f0ae069a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:19:11 compute-1 nova_compute[183403]: 2026-01-26 15:19:11.217 183407 DEBUG nova.network.neutron [-] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:19:11 compute-1 nova_compute[183403]: 2026-01-26 15:19:11.319 183407 DEBUG nova.compute.manager [req-815dbeb3-2654-45b4-9def-ea4c2b15763b req-992379e4-1eff-4908-ba75-9082f0ae069a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Detach interface failed, port_id=74d00ce8-8619-4c5a-a2f4-4018d57f1469, reason: Instance c52ee407-1afb-4ae3-ae7f-792592e6badf could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:19:11 compute-1 nova_compute[183403]: 2026-01-26 15:19:11.723 183407 INFO nova.compute.manager [-] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Took 1.78 seconds to deallocate network for instance.
Jan 26 15:19:11 compute-1 nova_compute[183403]: 2026-01-26 15:19:11.810 183407 DEBUG nova.compute.manager [req-2f802bf4-cd63-4900-9641-319093145687 req-78fee7c6-9160-4bd1-a85f-5d31c853b5b5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Received event network-vif-unplugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:19:11 compute-1 nova_compute[183403]: 2026-01-26 15:19:11.811 183407 DEBUG oslo_concurrency.lockutils [req-2f802bf4-cd63-4900-9641-319093145687 req-78fee7c6-9160-4bd1-a85f-5d31c853b5b5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:11 compute-1 nova_compute[183403]: 2026-01-26 15:19:11.811 183407 DEBUG oslo_concurrency.lockutils [req-2f802bf4-cd63-4900-9641-319093145687 req-78fee7c6-9160-4bd1-a85f-5d31c853b5b5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:11 compute-1 nova_compute[183403]: 2026-01-26 15:19:11.811 183407 DEBUG oslo_concurrency.lockutils [req-2f802bf4-cd63-4900-9641-319093145687 req-78fee7c6-9160-4bd1-a85f-5d31c853b5b5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:11 compute-1 nova_compute[183403]: 2026-01-26 15:19:11.812 183407 DEBUG nova.compute.manager [req-2f802bf4-cd63-4900-9641-319093145687 req-78fee7c6-9160-4bd1-a85f-5d31c853b5b5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] No waiting events found dispatching network-vif-unplugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:19:11 compute-1 nova_compute[183403]: 2026-01-26 15:19:11.812 183407 DEBUG nova.compute.manager [req-2f802bf4-cd63-4900-9641-319093145687 req-78fee7c6-9160-4bd1-a85f-5d31c853b5b5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: c52ee407-1afb-4ae3-ae7f-792592e6badf] Received event network-vif-unplugged-74d00ce8-8619-4c5a-a2f4-4018d57f1469 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:19:12 compute-1 nova_compute[183403]: 2026-01-26 15:19:12.036 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance c52ee407-1afb-4ae3-ae7f-792592e6badf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:19:12 compute-1 nova_compute[183403]: 2026-01-26 15:19:12.037 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 8e369b01-79a6-4f8a-bf56-148d715aaea5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:19:12 compute-1 nova_compute[183403]: 2026-01-26 15:19:12.037 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:19:12 compute-1 nova_compute[183403]: 2026-01-26 15:19:12.038 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:19:10 up  1:14,  0 user,  load average: 0.20, 0.18, 0.28\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_3ed11f66f0de4f6191def09f65c67624': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:19:12 compute-1 nova_compute[183403]: 2026-01-26 15:19:12.112 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:19:12 compute-1 nova_compute[183403]: 2026-01-26 15:19:12.322 183407 DEBUG oslo_concurrency.lockutils [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:12 compute-1 nova_compute[183403]: 2026-01-26 15:19:12.625 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:19:12 compute-1 podman[208795]: 2026-01-26 15:19:12.908456158 +0000 UTC m=+0.070225463 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 15:19:13 compute-1 podman[208794]: 2026-01-26 15:19:13.001575544 +0000 UTC m=+0.168518583 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Jan 26 15:19:13 compute-1 nova_compute[183403]: 2026-01-26 15:19:13.137 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:19:13 compute-1 nova_compute[183403]: 2026-01-26 15:19:13.138 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.666s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:13 compute-1 nova_compute[183403]: 2026-01-26 15:19:13.138 183407 DEBUG oslo_concurrency.lockutils [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.816s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:13 compute-1 nova_compute[183403]: 2026-01-26 15:19:13.209 183407 DEBUG nova.compute.provider_tree [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:19:13 compute-1 nova_compute[183403]: 2026-01-26 15:19:13.717 183407 DEBUG nova.scheduler.client.report [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:19:14 compute-1 nova_compute[183403]: 2026-01-26 15:19:14.138 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:19:14 compute-1 nova_compute[183403]: 2026-01-26 15:19:14.139 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:19:14 compute-1 nova_compute[183403]: 2026-01-26 15:19:14.139 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:19:14 compute-1 nova_compute[183403]: 2026-01-26 15:19:14.139 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:19:14 compute-1 nova_compute[183403]: 2026-01-26 15:19:14.139 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:19:14 compute-1 nova_compute[183403]: 2026-01-26 15:19:14.228 183407 DEBUG oslo_concurrency.lockutils [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:14 compute-1 nova_compute[183403]: 2026-01-26 15:19:14.266 183407 INFO nova.scheduler.client.report [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Deleted allocations for instance c52ee407-1afb-4ae3-ae7f-792592e6badf
Jan 26 15:19:14 compute-1 nova_compute[183403]: 2026-01-26 15:19:14.420 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:14 compute-1 nova_compute[183403]: 2026-01-26 15:19:14.900 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:15 compute-1 nova_compute[183403]: 2026-01-26 15:19:15.295 183407 DEBUG oslo_concurrency.lockutils [None req-2954a21b-0e9e-41ad-b62a-b124a811b049 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "c52ee407-1afb-4ae3-ae7f-792592e6badf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.219s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:15 compute-1 nova_compute[183403]: 2026-01-26 15:19:15.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.145 183407 DEBUG oslo_concurrency.lockutils [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "8e369b01-79a6-4f8a-bf56-148d715aaea5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.146 183407 DEBUG oslo_concurrency.lockutils [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "8e369b01-79a6-4f8a-bf56-148d715aaea5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.146 183407 DEBUG oslo_concurrency.lockutils [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "8e369b01-79a6-4f8a-bf56-148d715aaea5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.147 183407 DEBUG oslo_concurrency.lockutils [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "8e369b01-79a6-4f8a-bf56-148d715aaea5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.148 183407 DEBUG oslo_concurrency.lockutils [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "8e369b01-79a6-4f8a-bf56-148d715aaea5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.165 183407 INFO nova.compute.manager [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Terminating instance
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.685 183407 DEBUG nova.compute.manager [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:19:17 compute-1 kernel: tap3504cf38-e9 (unregistering): left promiscuous mode
Jan 26 15:19:17 compute-1 NetworkManager[55716]: <info>  [1769440757.7074] device (tap3504cf38-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.713 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:17 compute-1 ovn_controller[95641]: 2026-01-26T15:19:17Z|00121|binding|INFO|Releasing lport 3504cf38-e909-4755-aa5c-6de44f9944b0 from this chassis (sb_readonly=0)
Jan 26 15:19:17 compute-1 ovn_controller[95641]: 2026-01-26T15:19:17Z|00122|binding|INFO|Setting lport 3504cf38-e909-4755-aa5c-6de44f9944b0 down in Southbound
Jan 26 15:19:17 compute-1 ovn_controller[95641]: 2026-01-26T15:19:17Z|00123|binding|INFO|Removing iface tap3504cf38-e9 ovn-installed in OVS
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.717 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:17 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:17.724 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:34:ef 10.100.0.4'], port_security=['fa:16:3e:d3:34:ef 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8e369b01-79a6-4f8a-bf56-148d715aaea5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'e54c8a16-0c91-4c9d-aa33-a10f9396fada', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ffea2f62-0986-47bd-a80c-89f1c9decf3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=3504cf38-e909-4755-aa5c-6de44f9944b0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:19:17 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:17.725 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 3504cf38-e909-4755-aa5c-6de44f9944b0 in datapath 0df777d6-b389-44bc-b166-8208ab926234 unbound from our chassis
Jan 26 15:19:17 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:17.726 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0df777d6-b389-44bc-b166-8208ab926234, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:19:17 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:17.727 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[58acbede-70e3-4321-8b49-7aaff4fb1d13]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:17 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:17.727 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 namespace which is not needed anymore
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.735 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:17 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 26 15:19:17 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Consumed 3.073s CPU time.
Jan 26 15:19:17 compute-1 systemd-machined[154697]: Machine qemu-10-instance-0000000c terminated.
Jan 26 15:19:17 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[208488]: [NOTICE]   (208492) : haproxy version is 3.0.5-8e879a5
Jan 26 15:19:17 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[208488]: [NOTICE]   (208492) : path to executable is /usr/sbin/haproxy
Jan 26 15:19:17 compute-1 podman[208866]: 2026-01-26 15:19:17.884136363 +0000 UTC m=+0.042334332 container kill acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0)
Jan 26 15:19:17 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[208488]: [WARNING]  (208492) : Exiting Master process...
Jan 26 15:19:17 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[208488]: [ALERT]    (208492) : Current worker (208494) exited with code 143 (Terminated)
Jan 26 15:19:17 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[208488]: [WARNING]  (208492) : All workers exited. Exiting... (0)
Jan 26 15:19:17 compute-1 systemd[1]: libpod-acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08.scope: Deactivated successfully.
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.894 183407 DEBUG nova.compute.manager [req-5e1ad974-32e7-4bec-8d1e-0d5df56df44f req-a22614fd-60ce-4869-8272-e341e3f9d15c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Received event network-vif-unplugged-3504cf38-e909-4755-aa5c-6de44f9944b0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.895 183407 DEBUG oslo_concurrency.lockutils [req-5e1ad974-32e7-4bec-8d1e-0d5df56df44f req-a22614fd-60ce-4869-8272-e341e3f9d15c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "8e369b01-79a6-4f8a-bf56-148d715aaea5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.895 183407 DEBUG oslo_concurrency.lockutils [req-5e1ad974-32e7-4bec-8d1e-0d5df56df44f req-a22614fd-60ce-4869-8272-e341e3f9d15c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8e369b01-79a6-4f8a-bf56-148d715aaea5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.895 183407 DEBUG oslo_concurrency.lockutils [req-5e1ad974-32e7-4bec-8d1e-0d5df56df44f req-a22614fd-60ce-4869-8272-e341e3f9d15c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8e369b01-79a6-4f8a-bf56-148d715aaea5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.896 183407 DEBUG nova.compute.manager [req-5e1ad974-32e7-4bec-8d1e-0d5df56df44f req-a22614fd-60ce-4869-8272-e341e3f9d15c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] No waiting events found dispatching network-vif-unplugged-3504cf38-e909-4755-aa5c-6de44f9944b0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.896 183407 DEBUG nova.compute.manager [req-5e1ad974-32e7-4bec-8d1e-0d5df56df44f req-a22614fd-60ce-4869-8272-e341e3f9d15c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Received event network-vif-unplugged-3504cf38-e909-4755-aa5c-6de44f9944b0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.910 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.914 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:17 compute-1 podman[208882]: 2026-01-26 15:19:17.953487642 +0000 UTC m=+0.045800148 container died acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, tcib_managed=true)
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.952 183407 INFO nova.virt.libvirt.driver [-] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Instance destroyed successfully.
Jan 26 15:19:17 compute-1 nova_compute[183403]: 2026-01-26 15:19:17.953 183407 DEBUG nova.objects.instance [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lazy-loading 'resources' on Instance uuid 8e369b01-79a6-4f8a-bf56-148d715aaea5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:19:18 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08-userdata-shm.mount: Deactivated successfully.
Jan 26 15:19:18 compute-1 systemd[1]: var-lib-containers-storage-overlay-54c90fd9e4f14f86a1e7485a0b062957c677212fb9460da19d06ec1155e35802-merged.mount: Deactivated successfully.
Jan 26 15:19:18 compute-1 podman[208882]: 2026-01-26 15:19:18.053217621 +0000 UTC m=+0.145530087 container cleanup acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0)
Jan 26 15:19:18 compute-1 systemd[1]: libpod-conmon-acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08.scope: Deactivated successfully.
Jan 26 15:19:18 compute-1 podman[208909]: 2026-01-26 15:19:18.14499813 +0000 UTC m=+0.181642876 container remove acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.155 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[306770be-897b-44a0-a989-ef03a8da1d11]: (4, ("Mon Jan 26 03:19:17 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 (acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08)\nacf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08\nMon Jan 26 03:19:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 (acf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08)\nacf5f2eb61d24fcef76d2067b9156b2d1875c0c1b8909012e4f453d596c9bb08\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.157 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ee507c62-6bff-4023-b060-cf13d732dc63]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.157 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.158 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5185fe29-dc2b-464b-9d12-7cccc3e34bcc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.159 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0df777d6-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.161 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:18 compute-1 kernel: tap0df777d6-b0: left promiscuous mode
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.191 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.194 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ec402f-0129-4620-b692-1022644f3de8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.211 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5b514931-0668-400a-a549-a27896622ec6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.213 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d326786e-73ad-4a93-9439-7cdaffdef53b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.231 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb97657-e7af-4b46-9715-037c3c7561a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441467, 'reachable_time': 35087, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208929, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.233 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:19:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:18.234 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[75b47933-5006-4605-8518-c4070768494e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:19:18 compute-1 systemd[1]: run-netns-ovnmeta\x2d0df777d6\x2db389\x2d44bc\x2db166\x2d8208ab926234.mount: Deactivated successfully.
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.459 183407 DEBUG nova.virt.libvirt.vif [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:17:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-794224949',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-794224949',id=12,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:17:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-dc35661j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:18:55Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=8e369b01-79a6-4f8a-bf56-148d715aaea5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3504cf38-e909-4755-aa5c-6de44f9944b0", "address": "fa:16:3e:d3:34:ef", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3504cf38-e9", "ovs_interfaceid": "3504cf38-e909-4755-aa5c-6de44f9944b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.460 183407 DEBUG nova.network.os_vif_util [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converting VIF {"id": "3504cf38-e909-4755-aa5c-6de44f9944b0", "address": "fa:16:3e:d3:34:ef", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3504cf38-e9", "ovs_interfaceid": "3504cf38-e909-4755-aa5c-6de44f9944b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.461 183407 DEBUG nova.network.os_vif_util [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:34:ef,bridge_name='br-int',has_traffic_filtering=True,id=3504cf38-e909-4755-aa5c-6de44f9944b0,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3504cf38-e9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.462 183407 DEBUG os_vif [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:34:ef,bridge_name='br-int',has_traffic_filtering=True,id=3504cf38-e909-4755-aa5c-6de44f9944b0,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3504cf38-e9') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.464 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.464 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3504cf38-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.517 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.518 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.519 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.519 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9e1ae6db-d9d0-4454-9129-4ff5130fc60f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.520 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.521 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.522 183407 INFO os_vif [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:34:ef,bridge_name='br-int',has_traffic_filtering=True,id=3504cf38-e909-4755-aa5c-6de44f9944b0,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3504cf38-e9')
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.523 183407 INFO nova.virt.libvirt.driver [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Deleting instance files /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5_del
Jan 26 15:19:18 compute-1 nova_compute[183403]: 2026-01-26 15:19:18.523 183407 INFO nova.virt.libvirt.driver [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Deletion of /var/lib/nova/instances/8e369b01-79a6-4f8a-bf56-148d715aaea5_del complete
Jan 26 15:19:19 compute-1 nova_compute[183403]: 2026-01-26 15:19:19.035 183407 INFO nova.compute.manager [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Took 1.35 seconds to destroy the instance on the hypervisor.
Jan 26 15:19:19 compute-1 nova_compute[183403]: 2026-01-26 15:19:19.035 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:19:19 compute-1 nova_compute[183403]: 2026-01-26 15:19:19.036 183407 DEBUG nova.compute.manager [-] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:19:19 compute-1 nova_compute[183403]: 2026-01-26 15:19:19.036 183407 DEBUG nova.network.neutron [-] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:19:19 compute-1 nova_compute[183403]: 2026-01-26 15:19:19.036 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:19:19 compute-1 nova_compute[183403]: 2026-01-26 15:19:19.402 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:19:19 compute-1 openstack_network_exporter[195610]: ERROR   15:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:19:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:19:19 compute-1 openstack_network_exporter[195610]: ERROR   15:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:19:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:19:19 compute-1 nova_compute[183403]: 2026-01-26 15:19:19.901 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:20 compute-1 nova_compute[183403]: 2026-01-26 15:19:20.055 183407 DEBUG nova.compute.manager [req-3b749865-8e02-4c22-bab1-312d2600e7ca req-025fbe85-00ea-43d6-8363-584e82814ce4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Received event network-vif-unplugged-3504cf38-e909-4755-aa5c-6de44f9944b0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:19:20 compute-1 nova_compute[183403]: 2026-01-26 15:19:20.056 183407 DEBUG oslo_concurrency.lockutils [req-3b749865-8e02-4c22-bab1-312d2600e7ca req-025fbe85-00ea-43d6-8363-584e82814ce4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "8e369b01-79a6-4f8a-bf56-148d715aaea5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:20 compute-1 nova_compute[183403]: 2026-01-26 15:19:20.056 183407 DEBUG oslo_concurrency.lockutils [req-3b749865-8e02-4c22-bab1-312d2600e7ca req-025fbe85-00ea-43d6-8363-584e82814ce4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8e369b01-79a6-4f8a-bf56-148d715aaea5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:20 compute-1 nova_compute[183403]: 2026-01-26 15:19:20.057 183407 DEBUG oslo_concurrency.lockutils [req-3b749865-8e02-4c22-bab1-312d2600e7ca req-025fbe85-00ea-43d6-8363-584e82814ce4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "8e369b01-79a6-4f8a-bf56-148d715aaea5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:20 compute-1 nova_compute[183403]: 2026-01-26 15:19:20.057 183407 DEBUG nova.compute.manager [req-3b749865-8e02-4c22-bab1-312d2600e7ca req-025fbe85-00ea-43d6-8363-584e82814ce4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] No waiting events found dispatching network-vif-unplugged-3504cf38-e909-4755-aa5c-6de44f9944b0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:19:20 compute-1 nova_compute[183403]: 2026-01-26 15:19:20.057 183407 DEBUG nova.compute.manager [req-3b749865-8e02-4c22-bab1-312d2600e7ca req-025fbe85-00ea-43d6-8363-584e82814ce4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Received event network-vif-unplugged-3504cf38-e909-4755-aa5c-6de44f9944b0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:19:20 compute-1 nova_compute[183403]: 2026-01-26 15:19:20.780 183407 DEBUG nova.network.neutron [-] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:19:21 compute-1 nova_compute[183403]: 2026-01-26 15:19:21.287 183407 INFO nova.compute.manager [-] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Took 2.25 seconds to deallocate network for instance.
Jan 26 15:19:21 compute-1 nova_compute[183403]: 2026-01-26 15:19:21.809 183407 DEBUG oslo_concurrency.lockutils [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:21 compute-1 nova_compute[183403]: 2026-01-26 15:19:21.810 183407 DEBUG oslo_concurrency.lockutils [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:21 compute-1 nova_compute[183403]: 2026-01-26 15:19:21.888 183407 DEBUG nova.compute.provider_tree [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:19:22 compute-1 nova_compute[183403]: 2026-01-26 15:19:22.129 183407 DEBUG nova.compute.manager [req-cc9473fc-fe2d-42c2-a681-91424ce5f1b5 req-082e8f7b-72b1-403d-bceb-184bf4b7ecd6 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 8e369b01-79a6-4f8a-bf56-148d715aaea5] Received event network-vif-deleted-3504cf38-e909-4755-aa5c-6de44f9944b0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:19:22 compute-1 nova_compute[183403]: 2026-01-26 15:19:22.396 183407 DEBUG nova.scheduler.client.report [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:19:23 compute-1 nova_compute[183403]: 2026-01-26 15:19:23.136 183407 DEBUG oslo_concurrency.lockutils [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.326s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:23 compute-1 nova_compute[183403]: 2026-01-26 15:19:23.173 183407 INFO nova.scheduler.client.report [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Deleted allocations for instance 8e369b01-79a6-4f8a-bf56-148d715aaea5
Jan 26 15:19:23 compute-1 nova_compute[183403]: 2026-01-26 15:19:23.558 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:24 compute-1 nova_compute[183403]: 2026-01-26 15:19:24.206 183407 DEBUG oslo_concurrency.lockutils [None req-23ffc20c-1d49-47ee-82af-875f84bbf555 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "8e369b01-79a6-4f8a-bf56-148d715aaea5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.060s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:24 compute-1 nova_compute[183403]: 2026-01-26 15:19:24.902 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:28 compute-1 nova_compute[183403]: 2026-01-26 15:19:28.560 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:29.051 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:29.052 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:29.052 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:29 compute-1 nova_compute[183403]: 2026-01-26 15:19:29.904 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:33.603 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:19:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:33.604 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:19:33 compute-1 nova_compute[183403]: 2026-01-26 15:19:33.610 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:34 compute-1 podman[208934]: 2026-01-26 15:19:34.881229247 +0000 UTC m=+0.058602113 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 15:19:34 compute-1 nova_compute[183403]: 2026-01-26 15:19:34.904 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:34 compute-1 podman[208933]: 2026-01-26 15:19:34.906265439 +0000 UTC m=+0.085818373 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:19:35 compute-1 podman[192725]: time="2026-01-26T15:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:19:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:19:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2189 "" "Go-http-client/1.1"
Jan 26 15:19:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:19:37.606 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:38 compute-1 nova_compute[183403]: 2026-01-26 15:19:38.613 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:39 compute-1 nova_compute[183403]: 2026-01-26 15:19:39.908 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:43 compute-1 nova_compute[183403]: 2026-01-26 15:19:43.615 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:43 compute-1 podman[208979]: 2026-01-26 15:19:43.709459943 +0000 UTC m=+0.069462252 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 15:19:43 compute-1 podman[208978]: 2026-01-26 15:19:43.75032227 +0000 UTC m=+0.109455732 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 26 15:19:44 compute-1 nova_compute[183403]: 2026-01-26 15:19:44.909 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:47 compute-1 nova_compute[183403]: 2026-01-26 15:19:47.668 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:47 compute-1 nova_compute[183403]: 2026-01-26 15:19:47.668 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:48 compute-1 nova_compute[183403]: 2026-01-26 15:19:48.174 183407 DEBUG nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:19:48 compute-1 nova_compute[183403]: 2026-01-26 15:19:48.619 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:48 compute-1 nova_compute[183403]: 2026-01-26 15:19:48.736 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:48 compute-1 nova_compute[183403]: 2026-01-26 15:19:48.736 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:48 compute-1 nova_compute[183403]: 2026-01-26 15:19:48.744 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:19:48 compute-1 nova_compute[183403]: 2026-01-26 15:19:48.745 183407 INFO nova.compute.claims [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:19:49 compute-1 openstack_network_exporter[195610]: ERROR   15:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:19:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:19:49 compute-1 openstack_network_exporter[195610]: ERROR   15:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:19:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:19:49 compute-1 nova_compute[183403]: 2026-01-26 15:19:49.809 183407 DEBUG nova.compute.provider_tree [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:19:49 compute-1 nova_compute[183403]: 2026-01-26 15:19:49.910 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:50 compute-1 nova_compute[183403]: 2026-01-26 15:19:50.319 183407 DEBUG nova.scheduler.client.report [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:19:50 compute-1 nova_compute[183403]: 2026-01-26 15:19:50.829 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:50 compute-1 nova_compute[183403]: 2026-01-26 15:19:50.830 183407 DEBUG nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:19:51 compute-1 nova_compute[183403]: 2026-01-26 15:19:51.343 183407 DEBUG nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:19:51 compute-1 nova_compute[183403]: 2026-01-26 15:19:51.343 183407 DEBUG nova.network.neutron [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:19:51 compute-1 nova_compute[183403]: 2026-01-26 15:19:51.344 183407 WARNING neutronclient.v2_0.client [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:19:51 compute-1 nova_compute[183403]: 2026-01-26 15:19:51.344 183407 WARNING neutronclient.v2_0.client [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:19:51 compute-1 nova_compute[183403]: 2026-01-26 15:19:51.851 183407 INFO nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:19:52 compute-1 nova_compute[183403]: 2026-01-26 15:19:52.134 183407 DEBUG nova.network.neutron [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Successfully created port: 27dfabe3-07b8-4c53-a020-e39d337bc11c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:19:52 compute-1 nova_compute[183403]: 2026-01-26 15:19:52.362 183407 DEBUG nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.226 183407 DEBUG nova.network.neutron [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Successfully updated port: 27dfabe3-07b8-4c53-a020-e39d337bc11c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.295 183407 DEBUG nova.compute.manager [req-ddb1039c-f1df-488d-a56e-6da7929f485a req-573af91a-a3ca-46c1-abb8-8152b6d5f98c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Received event network-changed-27dfabe3-07b8-4c53-a020-e39d337bc11c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.295 183407 DEBUG nova.compute.manager [req-ddb1039c-f1df-488d-a56e-6da7929f485a req-573af91a-a3ca-46c1-abb8-8152b6d5f98c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Refreshing instance network info cache due to event network-changed-27dfabe3-07b8-4c53-a020-e39d337bc11c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.295 183407 DEBUG oslo_concurrency.lockutils [req-ddb1039c-f1df-488d-a56e-6da7929f485a req-573af91a-a3ca-46c1-abb8-8152b6d5f98c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-de27c520-d8b0-42de-8d97-f48ffcc94a7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.296 183407 DEBUG oslo_concurrency.lockutils [req-ddb1039c-f1df-488d-a56e-6da7929f485a req-573af91a-a3ca-46c1-abb8-8152b6d5f98c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-de27c520-d8b0-42de-8d97-f48ffcc94a7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.296 183407 DEBUG nova.network.neutron [req-ddb1039c-f1df-488d-a56e-6da7929f485a req-573af91a-a3ca-46c1-abb8-8152b6d5f98c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Refreshing network info cache for port 27dfabe3-07b8-4c53-a020-e39d337bc11c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.389 183407 DEBUG nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.391 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.392 183407 INFO nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Creating image(s)
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.393 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "/var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.393 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "/var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.394 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "/var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.396 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.402 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.404 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.475 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.476 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.477 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.478 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.484 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.484 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.534 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.535 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.621 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk 1073741824" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.622 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.623 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.634 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.702 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.703 183407 DEBUG nova.virt.disk.api [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Checking if we can resize image /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.704 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.735 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "refresh_cache-de27c520-d8b0-42de-8d97-f48ffcc94a7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.757 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.758 183407 DEBUG nova.virt.disk.api [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Cannot resize image /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.759 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.759 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Ensure instance console log exists: /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.760 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.760 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.761 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:53 compute-1 nova_compute[183403]: 2026-01-26 15:19:53.802 183407 WARNING neutronclient.v2_0.client [req-ddb1039c-f1df-488d-a56e-6da7929f485a req-573af91a-a3ca-46c1-abb8-8152b6d5f98c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:19:54 compute-1 nova_compute[183403]: 2026-01-26 15:19:54.557 183407 DEBUG nova.network.neutron [req-ddb1039c-f1df-488d-a56e-6da7929f485a req-573af91a-a3ca-46c1-abb8-8152b6d5f98c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:19:54 compute-1 nova_compute[183403]: 2026-01-26 15:19:54.713 183407 DEBUG nova.network.neutron [req-ddb1039c-f1df-488d-a56e-6da7929f485a req-573af91a-a3ca-46c1-abb8-8152b6d5f98c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:19:54 compute-1 nova_compute[183403]: 2026-01-26 15:19:54.914 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:55 compute-1 nova_compute[183403]: 2026-01-26 15:19:55.225 183407 DEBUG oslo_concurrency.lockutils [req-ddb1039c-f1df-488d-a56e-6da7929f485a req-573af91a-a3ca-46c1-abb8-8152b6d5f98c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-de27c520-d8b0-42de-8d97-f48ffcc94a7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:19:55 compute-1 nova_compute[183403]: 2026-01-26 15:19:55.226 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquired lock "refresh_cache-de27c520-d8b0-42de-8d97-f48ffcc94a7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:19:55 compute-1 nova_compute[183403]: 2026-01-26 15:19:55.226 183407 DEBUG nova.network.neutron [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:19:56 compute-1 nova_compute[183403]: 2026-01-26 15:19:56.556 183407 DEBUG nova.network.neutron [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:19:57 compute-1 nova_compute[183403]: 2026-01-26 15:19:57.550 183407 WARNING neutronclient.v2_0.client [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:19:57 compute-1 nova_compute[183403]: 2026-01-26 15:19:57.799 183407 DEBUG nova.network.neutron [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Updating instance_info_cache with network_info: [{"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.306 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Releasing lock "refresh_cache-de27c520-d8b0-42de-8d97-f48ffcc94a7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.307 183407 DEBUG nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Instance network_info: |[{"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.311 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Start _get_guest_xml network_info=[{"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.317 183407 WARNING nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.319 183407 DEBUG nova.virt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1440107530', uuid='de27c520-d8b0-42de-8d97-f48ffcc94a7e'), owner=OwnerMeta(userid='eabb3af6e41e4d9e883fc43bd03679db', username='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin', projectid='3ed11f66f0de4f6191def09f65c67624', projectname='tempest-TestExecuteHostMaintenanceStrategy-1844876463'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769440798.3197365) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.324 183407 DEBUG nova.virt.libvirt.host [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.325 183407 DEBUG nova.virt.libvirt.host [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.328 183407 DEBUG nova.virt.libvirt.host [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.329 183407 DEBUG nova.virt.libvirt.host [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.331 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.331 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.332 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.332 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.333 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.333 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.334 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.334 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.334 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.335 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.335 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.336 183407 DEBUG nova.virt.hardware [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.342 183407 DEBUG nova.virt.libvirt.vif [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1440107530',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1440107530',id=15,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-01twax98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:19:52Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=de27c520-d8b0-42de-8d97-f48ffcc94a7e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.343 183407 DEBUG nova.network.os_vif_util [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converting VIF {"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.344 183407 DEBUG nova.network.os_vif_util [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:fb:7d,bridge_name='br-int',has_traffic_filtering=True,id=27dfabe3-07b8-4c53-a020-e39d337bc11c,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27dfabe3-07') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.345 183407 DEBUG nova.objects.instance [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lazy-loading 'pci_devices' on Instance uuid de27c520-d8b0-42de-8d97-f48ffcc94a7e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.639 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.853 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <uuid>de27c520-d8b0-42de-8d97-f48ffcc94a7e</uuid>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <name>instance-0000000f</name>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1440107530</nova:name>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:19:58</nova:creationTime>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:19:58 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:19:58 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:user uuid="eabb3af6e41e4d9e883fc43bd03679db">tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin</nova:user>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:project uuid="3ed11f66f0de4f6191def09f65c67624">tempest-TestExecuteHostMaintenanceStrategy-1844876463</nova:project>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         <nova:port uuid="27dfabe3-07b8-4c53-a020-e39d337bc11c">
Jan 26 15:19:58 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <system>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <entry name="serial">de27c520-d8b0-42de-8d97-f48ffcc94a7e</entry>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <entry name="uuid">de27c520-d8b0-42de-8d97-f48ffcc94a7e</entry>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     </system>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <os>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   </os>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <features>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   </features>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk.config"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:67:fb:7d"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <target dev="tap27dfabe3-07"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/console.log" append="off"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <video>
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     </video>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:19:58 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:19:58 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:19:58 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:19:58 compute-1 nova_compute[183403]: </domain>
Jan 26 15:19:58 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.854 183407 DEBUG nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Preparing to wait for external event network-vif-plugged-27dfabe3-07b8-4c53-a020-e39d337bc11c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.855 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.856 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.856 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.857 183407 DEBUG nova.virt.libvirt.vif [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1440107530',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1440107530',id=15,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-01twax98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:19:52Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=de27c520-d8b0-42de-8d97-f48ffcc94a7e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.858 183407 DEBUG nova.network.os_vif_util [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converting VIF {"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.859 183407 DEBUG nova.network.os_vif_util [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:fb:7d,bridge_name='br-int',has_traffic_filtering=True,id=27dfabe3-07b8-4c53-a020-e39d337bc11c,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27dfabe3-07') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.859 183407 DEBUG os_vif [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:fb:7d,bridge_name='br-int',has_traffic_filtering=True,id=27dfabe3-07b8-4c53-a020-e39d337bc11c,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27dfabe3-07') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.860 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.860 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.861 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.862 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.862 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ba8ad607-7343-5d26-9e98-e126b95fbcdc', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.864 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.866 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.869 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.869 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27dfabe3-07, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.870 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap27dfabe3-07, col_values=(('qos', UUID('9b222775-bea0-4654-8701-b9ae07dc6ab3')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.870 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap27dfabe3-07, col_values=(('external_ids', {'iface-id': '27dfabe3-07b8-4c53-a020-e39d337bc11c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:67:fb:7d', 'vm-uuid': 'de27c520-d8b0-42de-8d97-f48ffcc94a7e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:19:58 compute-1 NetworkManager[55716]: <info>  [1769440798.8730] manager: (tap27dfabe3-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.874 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.879 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:19:58 compute-1 nova_compute[183403]: 2026-01-26 15:19:58.880 183407 INFO os_vif [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:fb:7d,bridge_name='br-int',has_traffic_filtering=True,id=27dfabe3-07b8-4c53-a020-e39d337bc11c,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27dfabe3-07')
Jan 26 15:19:59 compute-1 nova_compute[183403]: 2026-01-26 15:19:59.915 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:00 compute-1 nova_compute[183403]: 2026-01-26 15:20:00.811 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:20:00 compute-1 nova_compute[183403]: 2026-01-26 15:20:00.812 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:20:00 compute-1 nova_compute[183403]: 2026-01-26 15:20:00.812 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] No VIF found with MAC fa:16:3e:67:fb:7d, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:20:00 compute-1 nova_compute[183403]: 2026-01-26 15:20:00.813 183407 INFO nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Using config drive
Jan 26 15:20:01 compute-1 nova_compute[183403]: 2026-01-26 15:20:01.336 183407 WARNING neutronclient.v2_0.client [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:01 compute-1 nova_compute[183403]: 2026-01-26 15:20:01.657 183407 INFO nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Creating config drive at /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk.config
Jan 26 15:20:01 compute-1 nova_compute[183403]: 2026-01-26 15:20:01.665 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp01scz5u1 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:01 compute-1 nova_compute[183403]: 2026-01-26 15:20:01.792 183407 DEBUG oslo_concurrency.processutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp01scz5u1" returned: 0 in 0.127s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:01 compute-1 kernel: tap27dfabe3-07: entered promiscuous mode
Jan 26 15:20:01 compute-1 NetworkManager[55716]: <info>  [1769440801.8799] manager: (tap27dfabe3-07): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 26 15:20:01 compute-1 ovn_controller[95641]: 2026-01-26T15:20:01Z|00124|binding|INFO|Claiming lport 27dfabe3-07b8-4c53-a020-e39d337bc11c for this chassis.
Jan 26 15:20:01 compute-1 ovn_controller[95641]: 2026-01-26T15:20:01Z|00125|binding|INFO|27dfabe3-07b8-4c53-a020-e39d337bc11c: Claiming fa:16:3e:67:fb:7d 10.100.0.13
Jan 26 15:20:01 compute-1 nova_compute[183403]: 2026-01-26 15:20:01.880 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.891 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:fb:7d 10.100.0.13'], port_security=['fa:16:3e:67:fb:7d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'de27c520-d8b0-42de-8d97-f48ffcc94a7e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e54c8a16-0c91-4c9d-aa33-a10f9396fada', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ffea2f62-0986-47bd-a80c-89f1c9decf3f, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=27dfabe3-07b8-4c53-a020-e39d337bc11c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.894 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 27dfabe3-07b8-4c53-a020-e39d337bc11c in datapath 0df777d6-b389-44bc-b166-8208ab926234 bound to our chassis
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.896 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:20:01 compute-1 ovn_controller[95641]: 2026-01-26T15:20:01Z|00126|binding|INFO|Setting lport 27dfabe3-07b8-4c53-a020-e39d337bc11c ovn-installed in OVS
Jan 26 15:20:01 compute-1 ovn_controller[95641]: 2026-01-26T15:20:01Z|00127|binding|INFO|Setting lport 27dfabe3-07b8-4c53-a020-e39d337bc11c up in Southbound
Jan 26 15:20:01 compute-1 nova_compute[183403]: 2026-01-26 15:20:01.907 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:01 compute-1 nova_compute[183403]: 2026-01-26 15:20:01.912 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.917 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f02f0403-f45e-46ac-804e-b2fdb82308ed]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.918 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0df777d6-b1 in ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:20:01 compute-1 systemd-udevd[209055]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.920 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0df777d6-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.920 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[749d3682-e8c7-42ff-a14e-0564900dae46]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.921 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a11fad18-dc75-49e9-9ec4-5a95caafd34c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.933 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[703c391f-d9e1-4a48-823e-069fb7ce7018]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:01 compute-1 NetworkManager[55716]: <info>  [1769440801.9387] device (tap27dfabe3-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:20:01 compute-1 NetworkManager[55716]: <info>  [1769440801.9391] device (tap27dfabe3-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:20:01 compute-1 systemd-machined[154697]: New machine qemu-11-instance-0000000f.
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.954 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d90677f1-f533-49eb-8e51-b1775a3acc4f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:01 compute-1 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.990 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[478dcc49-1901-4124-b747-7c0a48946f28]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:01 compute-1 NetworkManager[55716]: <info>  [1769440801.9973] manager: (tap0df777d6-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Jan 26 15:20:01 compute-1 systemd-udevd[209062]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:20:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:01.997 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[dc8c3247-a4e9-4aba-8528-cd1916d052ee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.036 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[b45fbaac-afe2-4332-b17c-547d74b8cf88]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.039 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a37057-962b-461c-95f3-2f05c7f6ea7b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 NetworkManager[55716]: <info>  [1769440802.0750] device (tap0df777d6-b0): carrier: link connected
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.084 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[444f786f-e458-4cac-8a80-32bb6f392663]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.107 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c79cda-f60c-44c3-baa5-6bc86bef2d0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0df777d6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:dc:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452467, 'reachable_time': 33453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209090, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.119 183407 DEBUG nova.compute.manager [req-9aa6a791-58ee-4e9d-9a17-35fab820f807 req-f1a0a9b2-55d5-4607-aa8c-e0991aabb089 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Received event network-vif-plugged-27dfabe3-07b8-4c53-a020-e39d337bc11c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.119 183407 DEBUG oslo_concurrency.lockutils [req-9aa6a791-58ee-4e9d-9a17-35fab820f807 req-f1a0a9b2-55d5-4607-aa8c-e0991aabb089 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.120 183407 DEBUG oslo_concurrency.lockutils [req-9aa6a791-58ee-4e9d-9a17-35fab820f807 req-f1a0a9b2-55d5-4607-aa8c-e0991aabb089 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.121 183407 DEBUG oslo_concurrency.lockutils [req-9aa6a791-58ee-4e9d-9a17-35fab820f807 req-f1a0a9b2-55d5-4607-aa8c-e0991aabb089 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.121 183407 DEBUG nova.compute.manager [req-9aa6a791-58ee-4e9d-9a17-35fab820f807 req-f1a0a9b2-55d5-4607-aa8c-e0991aabb089 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Processing event network-vif-plugged-27dfabe3-07b8-4c53-a020-e39d337bc11c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.131 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[923d32f0-fe8d-4938-81de-3d410d7c0db0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:dcc5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452467, 'tstamp': 452467}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209091, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.157 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f147112f-2010-4841-9bf5-d45e45b10f2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0df777d6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:dc:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452467, 'reachable_time': 33453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209092, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.199 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0b679076-248e-4f06-8bf5-13520ab259f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.290 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[cd106283-1e3c-41ad-826f-f2cba53a9b2a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.291 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0df777d6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.291 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.292 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0df777d6-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:02 compute-1 NetworkManager[55716]: <info>  [1769440802.2951] manager: (tap0df777d6-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 26 15:20:02 compute-1 kernel: tap0df777d6-b0: entered promiscuous mode
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.298 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0df777d6-b0, col_values=(('external_ids', {'iface-id': '33751b84-abc7-465c-a58b-58ca2b0cbc0a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:02 compute-1 ovn_controller[95641]: 2026-01-26T15:20:02Z|00128|binding|INFO|Releasing lport 33751b84-abc7-465c-a58b-58ca2b0cbc0a from this chassis (sb_readonly=0)
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.305 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.311 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a6294fd5-5a64-4066-8d5a-41d08555e840]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.312 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.312 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.312 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 0df777d6-b389-44bc-b166-8208ab926234 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.313 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.313 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5975323d-e671-4012-b092-9291707c5c17]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.314 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.315 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ed373bfe-a37f-4900-bca9-815fb0cfab70]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.316 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID 0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:20:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:02.317 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'env', 'PROCESS_TAG=haproxy-0df777d6-b389-44bc-b166-8208ab926234', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0df777d6-b389-44bc-b166-8208ab926234.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.329 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.421 183407 DEBUG nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.425 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.429 183407 INFO nova.virt.libvirt.driver [-] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Instance spawned successfully.
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.429 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:20:02 compute-1 podman[209131]: 2026-01-26 15:20:02.735434557 +0000 UTC m=+0.052629846 container create f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120)
Jan 26 15:20:02 compute-1 systemd[1]: Started libpod-conmon-f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312.scope.
Jan 26 15:20:02 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:20:02 compute-1 podman[209131]: 2026-01-26 15:20:02.708231406 +0000 UTC m=+0.025426695 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:20:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9293a14416c9512048c96c12bad82b0e9962331635a5959f83e15160ddd64a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:20:02 compute-1 podman[209131]: 2026-01-26 15:20:02.828912338 +0000 UTC m=+0.146107677 container init f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120)
Jan 26 15:20:02 compute-1 podman[209131]: 2026-01-26 15:20:02.839809819 +0000 UTC m=+0.157005138 container start f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260120)
Jan 26 15:20:02 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[209144]: [NOTICE]   (209148) : New worker (209150) forked
Jan 26 15:20:02 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[209144]: [NOTICE]   (209148) : Loading success.
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.939 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.940 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.940 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.941 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.941 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:20:02 compute-1 nova_compute[183403]: 2026-01-26 15:20:02.942 183407 DEBUG nova.virt.libvirt.driver [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:20:03 compute-1 nova_compute[183403]: 2026-01-26 15:20:03.453 183407 INFO nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Took 10.06 seconds to spawn the instance on the hypervisor.
Jan 26 15:20:03 compute-1 nova_compute[183403]: 2026-01-26 15:20:03.453 183407 DEBUG nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:20:03 compute-1 nova_compute[183403]: 2026-01-26 15:20:03.873 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:03 compute-1 nova_compute[183403]: 2026-01-26 15:20:03.995 183407 INFO nova.compute.manager [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Took 15.31 seconds to build instance.
Jan 26 15:20:04 compute-1 nova_compute[183403]: 2026-01-26 15:20:04.187 183407 DEBUG nova.compute.manager [req-e797ad26-2100-4471-a51b-1f228b1de15f req-81998ad6-dea7-4377-a253-d5313d940423 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Received event network-vif-plugged-27dfabe3-07b8-4c53-a020-e39d337bc11c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:20:04 compute-1 nova_compute[183403]: 2026-01-26 15:20:04.187 183407 DEBUG oslo_concurrency.lockutils [req-e797ad26-2100-4471-a51b-1f228b1de15f req-81998ad6-dea7-4377-a253-d5313d940423 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:04 compute-1 nova_compute[183403]: 2026-01-26 15:20:04.188 183407 DEBUG oslo_concurrency.lockutils [req-e797ad26-2100-4471-a51b-1f228b1de15f req-81998ad6-dea7-4377-a253-d5313d940423 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:04 compute-1 nova_compute[183403]: 2026-01-26 15:20:04.188 183407 DEBUG oslo_concurrency.lockutils [req-e797ad26-2100-4471-a51b-1f228b1de15f req-81998ad6-dea7-4377-a253-d5313d940423 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:04 compute-1 nova_compute[183403]: 2026-01-26 15:20:04.189 183407 DEBUG nova.compute.manager [req-e797ad26-2100-4471-a51b-1f228b1de15f req-81998ad6-dea7-4377-a253-d5313d940423 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] No waiting events found dispatching network-vif-plugged-27dfabe3-07b8-4c53-a020-e39d337bc11c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:20:04 compute-1 nova_compute[183403]: 2026-01-26 15:20:04.189 183407 WARNING nova.compute.manager [req-e797ad26-2100-4471-a51b-1f228b1de15f req-81998ad6-dea7-4377-a253-d5313d940423 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Received unexpected event network-vif-plugged-27dfabe3-07b8-4c53-a020-e39d337bc11c for instance with vm_state active and task_state None.
Jan 26 15:20:04 compute-1 nova_compute[183403]: 2026-01-26 15:20:04.501 183407 DEBUG oslo_concurrency.lockutils [None req-858e31db-2733-4704-879f-ed87746ec6b5 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.833s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:04 compute-1 nova_compute[183403]: 2026-01-26 15:20:04.917 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:05 compute-1 podman[192725]: time="2026-01-26T15:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:20:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:20:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2652 "" "Go-http-client/1.1"
Jan 26 15:20:05 compute-1 podman[209159]: 2026-01-26 15:20:05.886068491 +0000 UTC m=+0.061938437 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:20:05 compute-1 podman[209160]: 2026-01-26 15:20:05.922438958 +0000 UTC m=+0.086660070 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Jan 26 15:20:08 compute-1 nova_compute[183403]: 2026-01-26 15:20:08.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:20:08 compute-1 nova_compute[183403]: 2026-01-26 15:20:08.876 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:09 compute-1 nova_compute[183403]: 2026-01-26 15:20:09.919 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:10 compute-1 sshd-session[209204]: Invalid user sol from 80.94.92.171 port 42062
Jan 26 15:20:10 compute-1 sshd-session[209204]: Connection closed by invalid user sol 80.94.92.171 port 42062 [preauth]
Jan 26 15:20:10 compute-1 nova_compute[183403]: 2026-01-26 15:20:10.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:20:10 compute-1 nova_compute[183403]: 2026-01-26 15:20:10.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:20:11 compute-1 nova_compute[183403]: 2026-01-26 15:20:11.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:11 compute-1 nova_compute[183403]: 2026-01-26 15:20:11.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:11 compute-1 nova_compute[183403]: 2026-01-26 15:20:11.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:11 compute-1 nova_compute[183403]: 2026-01-26 15:20:11.095 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.141 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.236 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.238 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.306 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.481 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.482 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.503 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.504 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5649MB free_disk=73.14399719238281GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.505 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:12 compute-1 nova_compute[183403]: 2026-01-26 15:20:12.505 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:13 compute-1 nova_compute[183403]: 2026-01-26 15:20:13.549 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance de27c520-d8b0-42de-8d97-f48ffcc94a7e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:20:13 compute-1 nova_compute[183403]: 2026-01-26 15:20:13.550 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:20:13 compute-1 nova_compute[183403]: 2026-01-26 15:20:13.550 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:20:12 up  1:15,  0 user,  load average: 0.36, 0.22, 0.28\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3ed11f66f0de4f6191def09f65c67624': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:20:13 compute-1 nova_compute[183403]: 2026-01-26 15:20:13.582 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:20:13 compute-1 nova_compute[183403]: 2026-01-26 15:20:13.878 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:13 compute-1 podman[209227]: 2026-01-26 15:20:13.901468042 +0000 UTC m=+0.070472975 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 15:20:13 compute-1 podman[209226]: 2026-01-26 15:20:13.925192433 +0000 UTC m=+0.097491878 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:20:14 compute-1 nova_compute[183403]: 2026-01-26 15:20:14.089 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:20:14 compute-1 ovn_controller[95641]: 2026-01-26T15:20:14Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:67:fb:7d 10.100.0.13
Jan 26 15:20:14 compute-1 ovn_controller[95641]: 2026-01-26T15:20:14Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:67:fb:7d 10.100.0.13
Jan 26 15:20:14 compute-1 nova_compute[183403]: 2026-01-26 15:20:14.599 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:20:14 compute-1 nova_compute[183403]: 2026-01-26 15:20:14.600 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:14 compute-1 nova_compute[183403]: 2026-01-26 15:20:14.919 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:16 compute-1 nova_compute[183403]: 2026-01-26 15:20:16.534 183407 DEBUG nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Creating tmpfile /var/lib/nova/instances/tmpv6lweu7c to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:20:16 compute-1 nova_compute[183403]: 2026-01-26 15:20:16.535 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:16 compute-1 nova_compute[183403]: 2026-01-26 15:20:16.539 183407 DEBUG nova.compute.manager [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6lweu7c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:20:16 compute-1 nova_compute[183403]: 2026-01-26 15:20:16.600 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:20:17 compute-1 nova_compute[183403]: 2026-01-26 15:20:17.111 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:20:17 compute-1 nova_compute[183403]: 2026-01-26 15:20:17.112 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:20:17 compute-1 nova_compute[183403]: 2026-01-26 15:20:17.112 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:20:17 compute-1 nova_compute[183403]: 2026-01-26 15:20:17.112 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:20:17 compute-1 nova_compute[183403]: 2026-01-26 15:20:17.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:20:17 compute-1 nova_compute[183403]: 2026-01-26 15:20:17.578 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:20:18 compute-1 nova_compute[183403]: 2026-01-26 15:20:18.562 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:18 compute-1 nova_compute[183403]: 2026-01-26 15:20:18.880 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:19 compute-1 openstack_network_exporter[195610]: ERROR   15:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:20:19 compute-1 openstack_network_exporter[195610]: ERROR   15:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:20:19 compute-1 nova_compute[183403]: 2026-01-26 15:20:19.922 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:22 compute-1 nova_compute[183403]: 2026-01-26 15:20:22.548 183407 DEBUG nova.compute.manager [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6lweu7c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6352ae1-45e0-4f4e-9df8-f441325b4c16',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:20:23 compute-1 nova_compute[183403]: 2026-01-26 15:20:23.582 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-f6352ae1-45e0-4f4e-9df8-f441325b4c16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:20:23 compute-1 nova_compute[183403]: 2026-01-26 15:20:23.582 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-f6352ae1-45e0-4f4e-9df8-f441325b4c16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:20:23 compute-1 nova_compute[183403]: 2026-01-26 15:20:23.582 183407 DEBUG nova.network.neutron [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:20:23 compute-1 nova_compute[183403]: 2026-01-26 15:20:23.883 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:24 compute-1 nova_compute[183403]: 2026-01-26 15:20:24.089 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:24 compute-1 nova_compute[183403]: 2026-01-26 15:20:24.889 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:24 compute-1 nova_compute[183403]: 2026-01-26 15:20:24.923 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:25 compute-1 nova_compute[183403]: 2026-01-26 15:20:25.110 183407 DEBUG nova.network.neutron [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Updating instance_info_cache with network_info: [{"id": "608de942-8c66-4087-931c-0955fbd6c10b", "address": "fa:16:3e:78:e5:dd", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap608de942-8c", "ovs_interfaceid": "608de942-8c66-4087-931c-0955fbd6c10b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:20:25 compute-1 nova_compute[183403]: 2026-01-26 15:20:25.616 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-f6352ae1-45e0-4f4e-9df8-f441325b4c16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:20:25 compute-1 nova_compute[183403]: 2026-01-26 15:20:25.634 183407 DEBUG nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6lweu7c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6352ae1-45e0-4f4e-9df8-f441325b4c16',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:20:25 compute-1 nova_compute[183403]: 2026-01-26 15:20:25.635 183407 DEBUG nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Creating instance directory: /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:20:25 compute-1 nova_compute[183403]: 2026-01-26 15:20:25.636 183407 DEBUG nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Creating disk.info with the contents: {'/var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk': 'qcow2', '/var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:20:25 compute-1 nova_compute[183403]: 2026-01-26 15:20:25.637 183407 DEBUG nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:20:25 compute-1 nova_compute[183403]: 2026-01-26 15:20:25.637 183407 DEBUG nova.objects.instance [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid f6352ae1-45e0-4f4e-9df8-f441325b4c16 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.150 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.157 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.160 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.231 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.232 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.233 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.233 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.238 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.238 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.292 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.292 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.337 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.338 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.338 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.415 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.416 183407 DEBUG nova.virt.disk.api [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.417 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.467 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.468 183407 DEBUG nova.virt.disk.api [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.469 183407 DEBUG nova.objects.instance [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid f6352ae1-45e0-4f4e-9df8-f441325b4c16 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.977 183407 DEBUG nova.objects.base [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<f6352ae1-45e0-4f4e-9df8-f441325b4c16> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:20:26 compute-1 nova_compute[183403]: 2026-01-26 15:20:26.978 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.004 183407 DEBUG oslo_concurrency.processutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16/disk.config 497664" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.005 183407 DEBUG nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.006 183407 DEBUG nova.virt.libvirt.vif [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:19:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1014049427',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1014049427',id=14,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:19:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-mvifsctc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:19:43Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=f6352ae1-45e0-4f4e-9df8-f441325b4c16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "608de942-8c66-4087-931c-0955fbd6c10b", "address": "fa:16:3e:78:e5:dd", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap608de942-8c", "ovs_interfaceid": "608de942-8c66-4087-931c-0955fbd6c10b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.007 183407 DEBUG nova.network.os_vif_util [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "608de942-8c66-4087-931c-0955fbd6c10b", "address": "fa:16:3e:78:e5:dd", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap608de942-8c", "ovs_interfaceid": "608de942-8c66-4087-931c-0955fbd6c10b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.008 183407 DEBUG nova.network.os_vif_util [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:e5:dd,bridge_name='br-int',has_traffic_filtering=True,id=608de942-8c66-4087-931c-0955fbd6c10b,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap608de942-8c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.008 183407 DEBUG os_vif [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e5:dd,bridge_name='br-int',has_traffic_filtering=True,id=608de942-8c66-4087-931c-0955fbd6c10b,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap608de942-8c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.009 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.009 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.010 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.011 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.011 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '79664044-03ef-517c-9e66-663515b73552', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.013 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.016 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.021 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.021 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap608de942-8c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.022 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap608de942-8c, col_values=(('qos', UUID('511ac969-cce1-48aa-b337-369e40dcb0a7')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.022 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap608de942-8c, col_values=(('external_ids', {'iface-id': '608de942-8c66-4087-931c-0955fbd6c10b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:e5:dd', 'vm-uuid': 'f6352ae1-45e0-4f4e-9df8-f441325b4c16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.023 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:27 compute-1 NetworkManager[55716]: <info>  [1769440827.0248] manager: (tap608de942-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.026 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.031 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.031 183407 INFO os_vif [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e5:dd,bridge_name='br-int',has_traffic_filtering=True,id=608de942-8c66-4087-931c-0955fbd6c10b,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap608de942-8c')
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.032 183407 DEBUG nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.032 183407 DEBUG nova.compute.manager [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6lweu7c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6352ae1-45e0-4f4e-9df8-f441325b4c16',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.033 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:27 compute-1 nova_compute[183403]: 2026-01-26 15:20:27.563 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:29.054 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:29.055 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:29.056 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:29 compute-1 nova_compute[183403]: 2026-01-26 15:20:29.689 183407 DEBUG nova.network.neutron [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Port 608de942-8c66-4087-931c-0955fbd6c10b updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:20:29 compute-1 nova_compute[183403]: 2026-01-26 15:20:29.709 183407 DEBUG nova.compute.manager [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6lweu7c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6352ae1-45e0-4f4e-9df8-f441325b4c16',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:20:29 compute-1 nova_compute[183403]: 2026-01-26 15:20:29.927 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:31 compute-1 ovn_controller[95641]: 2026-01-26T15:20:31Z|00129|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 26 15:20:32 compute-1 nova_compute[183403]: 2026-01-26 15:20:32.024 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:32 compute-1 NetworkManager[55716]: <info>  [1769440832.2475] manager: (tap608de942-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Jan 26 15:20:32 compute-1 kernel: tap608de942-8c: entered promiscuous mode
Jan 26 15:20:32 compute-1 ovn_controller[95641]: 2026-01-26T15:20:32Z|00130|binding|INFO|Claiming lport 608de942-8c66-4087-931c-0955fbd6c10b for this additional chassis.
Jan 26 15:20:32 compute-1 nova_compute[183403]: 2026-01-26 15:20:32.250 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:32 compute-1 ovn_controller[95641]: 2026-01-26T15:20:32Z|00131|binding|INFO|608de942-8c66-4087-931c-0955fbd6c10b: Claiming fa:16:3e:78:e5:dd 10.100.0.6
Jan 26 15:20:32 compute-1 nova_compute[183403]: 2026-01-26 15:20:32.266 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:32 compute-1 ovn_controller[95641]: 2026-01-26T15:20:32Z|00132|binding|INFO|Setting lport 608de942-8c66-4087-931c-0955fbd6c10b ovn-installed in OVS
Jan 26 15:20:32 compute-1 nova_compute[183403]: 2026-01-26 15:20:32.268 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:32 compute-1 nova_compute[183403]: 2026-01-26 15:20:32.269 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:32 compute-1 systemd-udevd[209305]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:20:32 compute-1 systemd-machined[154697]: New machine qemu-12-instance-0000000e.
Jan 26 15:20:32 compute-1 NetworkManager[55716]: <info>  [1769440832.3086] device (tap608de942-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:20:32 compute-1 NetworkManager[55716]: <info>  [1769440832.3102] device (tap608de942-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:20:32 compute-1 systemd[1]: Started Virtual Machine qemu-12-instance-0000000e.
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.356 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:e5:dd 10.100.0.6'], port_security=['fa:16:3e:78:e5:dd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f6352ae1-45e0-4f4e-9df8-f441325b4c16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'e54c8a16-0c91-4c9d-aa33-a10f9396fada', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ffea2f62-0986-47bd-a80c-89f1c9decf3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=608de942-8c66-4087-931c-0955fbd6c10b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.357 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 608de942-8c66-4087-931c-0955fbd6c10b in datapath 0df777d6-b389-44bc-b166-8208ab926234 unbound from our chassis
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.358 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.383 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3273a6bc-7988-467d-8357-f0fa3d7c8e7a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.419 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[daed74b8-14c1-4aed-b274-a01042cd77a1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.422 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[51858361-3237-4f30-a8ee-6e121f1939f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.455 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd9e23a-58c4-4507-a271-1c12f35b63a6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.477 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[243dae1d-de30-4e8a-aa2e-ca554e161ac2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0df777d6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:dc:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452467, 'reachable_time': 33453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209320, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.494 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[88a28fc8-c514-4b7d-afe5-39762a33a010]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0df777d6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452484, 'tstamp': 452484}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209321, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0df777d6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452488, 'tstamp': 452488}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209321, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.495 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0df777d6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:32 compute-1 nova_compute[183403]: 2026-01-26 15:20:32.497 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:32 compute-1 nova_compute[183403]: 2026-01-26 15:20:32.498 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.498 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0df777d6-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.498 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.498 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0df777d6-b0, col_values=(('external_ids', {'iface-id': '33751b84-abc7-465c-a58b-58ca2b0cbc0a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.498 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:20:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:32.500 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a50fa558-c222-4ea0-b7bc-61a7d73ff09d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0df777d6-b389-44bc-b166-8208ab926234\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0df777d6-b389-44bc-b166-8208ab926234\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:34 compute-1 nova_compute[183403]: 2026-01-26 15:20:34.927 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:35 compute-1 ovn_controller[95641]: 2026-01-26T15:20:35Z|00133|binding|INFO|Claiming lport 608de942-8c66-4087-931c-0955fbd6c10b for this chassis.
Jan 26 15:20:35 compute-1 ovn_controller[95641]: 2026-01-26T15:20:35Z|00134|binding|INFO|608de942-8c66-4087-931c-0955fbd6c10b: Claiming fa:16:3e:78:e5:dd 10.100.0.6
Jan 26 15:20:35 compute-1 ovn_controller[95641]: 2026-01-26T15:20:35Z|00135|binding|INFO|Setting lport 608de942-8c66-4087-931c-0955fbd6c10b up in Southbound
Jan 26 15:20:35 compute-1 podman[192725]: time="2026-01-26T15:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:20:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:20:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2653 "" "Go-http-client/1.1"
Jan 26 15:20:36 compute-1 nova_compute[183403]: 2026-01-26 15:20:36.399 183407 INFO nova.compute.manager [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Post operation of migration started
Jan 26 15:20:36 compute-1 nova_compute[183403]: 2026-01-26 15:20:36.400 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:36 compute-1 nova_compute[183403]: 2026-01-26 15:20:36.580 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:36 compute-1 nova_compute[183403]: 2026-01-26 15:20:36.580 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:36 compute-1 nova_compute[183403]: 2026-01-26 15:20:36.659 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-f6352ae1-45e0-4f4e-9df8-f441325b4c16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:20:36 compute-1 nova_compute[183403]: 2026-01-26 15:20:36.659 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-f6352ae1-45e0-4f4e-9df8-f441325b4c16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:20:36 compute-1 nova_compute[183403]: 2026-01-26 15:20:36.660 183407 DEBUG nova.network.neutron [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:20:36 compute-1 podman[209343]: 2026-01-26 15:20:36.922083705 +0000 UTC m=+0.085882930 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:20:36 compute-1 podman[209344]: 2026-01-26 15:20:36.931098 +0000 UTC m=+0.093836717 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Jan 26 15:20:37 compute-1 nova_compute[183403]: 2026-01-26 15:20:37.028 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:37 compute-1 nova_compute[183403]: 2026-01-26 15:20:37.166 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:37 compute-1 nova_compute[183403]: 2026-01-26 15:20:37.612 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:37 compute-1 nova_compute[183403]: 2026-01-26 15:20:37.820 183407 DEBUG nova.network.neutron [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Updating instance_info_cache with network_info: [{"id": "608de942-8c66-4087-931c-0955fbd6c10b", "address": "fa:16:3e:78:e5:dd", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap608de942-8c", "ovs_interfaceid": "608de942-8c66-4087-931c-0955fbd6c10b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:20:38 compute-1 nova_compute[183403]: 2026-01-26 15:20:38.326 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-f6352ae1-45e0-4f4e-9df8-f441325b4c16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:20:38 compute-1 nova_compute[183403]: 2026-01-26 15:20:38.856 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:38 compute-1 nova_compute[183403]: 2026-01-26 15:20:38.856 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:38 compute-1 nova_compute[183403]: 2026-01-26 15:20:38.857 183407 DEBUG oslo_concurrency.lockutils [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:38 compute-1 nova_compute[183403]: 2026-01-26 15:20:38.862 183407 INFO nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:20:38 compute-1 virtqemud[183290]: Domain id=12 name='instance-0000000e' uuid=f6352ae1-45e0-4f4e-9df8-f441325b4c16 is tainted: custom-monitor
Jan 26 15:20:39 compute-1 nova_compute[183403]: 2026-01-26 15:20:39.871 183407 INFO nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:20:39 compute-1 nova_compute[183403]: 2026-01-26 15:20:39.931 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:40 compute-1 nova_compute[183403]: 2026-01-26 15:20:40.876 183407 INFO nova.virt.libvirt.driver [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:20:40 compute-1 nova_compute[183403]: 2026-01-26 15:20:40.880 183407 DEBUG nova.compute.manager [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:20:41 compute-1 nova_compute[183403]: 2026-01-26 15:20:41.392 183407 DEBUG nova.objects.instance [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:20:42 compute-1 nova_compute[183403]: 2026-01-26 15:20:42.030 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:42 compute-1 nova_compute[183403]: 2026-01-26 15:20:42.418 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:42 compute-1 nova_compute[183403]: 2026-01-26 15:20:42.581 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:42 compute-1 nova_compute[183403]: 2026-01-26 15:20:42.582 183407 WARNING neutronclient.v2_0.client [None req-cbc65f65-5de5-4bbc-a098-04d1c15a50d9 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:44 compute-1 podman[209390]: 2026-01-26 15:20:44.899731491 +0000 UTC m=+0.073072906 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:20:44 compute-1 nova_compute[183403]: 2026-01-26 15:20:44.933 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:44 compute-1 podman[209389]: 2026-01-26 15:20:44.945870814 +0000 UTC m=+0.124074596 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 26 15:20:47 compute-1 nova_compute[183403]: 2026-01-26 15:20:47.035 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:49 compute-1 openstack_network_exporter[195610]: ERROR   15:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:20:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:20:49 compute-1 openstack_network_exporter[195610]: ERROR   15:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:20:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:20:49 compute-1 nova_compute[183403]: 2026-01-26 15:20:49.935 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:50 compute-1 nova_compute[183403]: 2026-01-26 15:20:50.895 183407 DEBUG oslo_concurrency.lockutils [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:50 compute-1 nova_compute[183403]: 2026-01-26 15:20:50.895 183407 DEBUG oslo_concurrency.lockutils [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:50 compute-1 nova_compute[183403]: 2026-01-26 15:20:50.896 183407 DEBUG oslo_concurrency.lockutils [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:50 compute-1 nova_compute[183403]: 2026-01-26 15:20:50.896 183407 DEBUG oslo_concurrency.lockutils [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:50 compute-1 nova_compute[183403]: 2026-01-26 15:20:50.896 183407 DEBUG oslo_concurrency.lockutils [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:50 compute-1 nova_compute[183403]: 2026-01-26 15:20:50.915 183407 INFO nova.compute.manager [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Terminating instance
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.432 183407 DEBUG nova.compute.manager [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:20:51 compute-1 kernel: tap27dfabe3-07 (unregistering): left promiscuous mode
Jan 26 15:20:51 compute-1 NetworkManager[55716]: <info>  [1769440851.4653] device (tap27dfabe3-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:20:51 compute-1 ovn_controller[95641]: 2026-01-26T15:20:51Z|00136|binding|INFO|Releasing lport 27dfabe3-07b8-4c53-a020-e39d337bc11c from this chassis (sb_readonly=0)
Jan 26 15:20:51 compute-1 ovn_controller[95641]: 2026-01-26T15:20:51Z|00137|binding|INFO|Setting lport 27dfabe3-07b8-4c53-a020-e39d337bc11c down in Southbound
Jan 26 15:20:51 compute-1 ovn_controller[95641]: 2026-01-26T15:20:51Z|00138|binding|INFO|Removing iface tap27dfabe3-07 ovn-installed in OVS
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.475 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.476 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.488 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:fb:7d 10.100.0.13'], port_security=['fa:16:3e:67:fb:7d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'de27c520-d8b0-42de-8d97-f48ffcc94a7e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e54c8a16-0c91-4c9d-aa33-a10f9396fada', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ffea2f62-0986-47bd-a80c-89f1c9decf3f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=27dfabe3-07b8-4c53-a020-e39d337bc11c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.489 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 27dfabe3-07b8-4c53-a020-e39d337bc11c in datapath 0df777d6-b389-44bc-b166-8208ab926234 unbound from our chassis
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.490 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0df777d6-b389-44bc-b166-8208ab926234
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.498 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.519 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[96576797-9400-4add-82a0-f436049ea794]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:51 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 26 15:20:51 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 14.730s CPU time.
Jan 26 15:20:51 compute-1 systemd-machined[154697]: Machine qemu-11-instance-0000000f terminated.
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.562 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8839b2-233b-4eea-a104-1daa08e2eeba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.566 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[c01e70a6-10c0-4ffa-babd-41cfbae98268]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.601 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[5530e335-ca58-4113-b8a4-4947cf4c412a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.624 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac335a8-6792-437a-8bff-f6d8b931ab6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0df777d6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:dc:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452467, 'reachable_time': 33453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209447, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.649 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7adb2b38-8187-477c-83bc-6531120036b5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0df777d6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452484, 'tstamp': 452484}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209448, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0df777d6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452488, 'tstamp': 452488}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209448, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.651 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0df777d6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.652 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.658 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0df777d6-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.658 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.658 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.658 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0df777d6-b0, col_values=(('external_ids', {'iface-id': '33751b84-abc7-465c-a58b-58ca2b0cbc0a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.659 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:20:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:51.660 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[96c13741-219b-47cb-961d-f6efee39b78b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0df777d6-b389-44bc-b166-8208ab926234\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0df777d6-b389-44bc-b166-8208ab926234\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.661 183407 DEBUG nova.compute.manager [req-5eecddfc-28f1-45db-8ad9-521af7a3f0d5 req-3e4a8326-6506-4af6-afb8-4715299f7241 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Received event network-vif-unplugged-27dfabe3-07b8-4c53-a020-e39d337bc11c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.662 183407 DEBUG oslo_concurrency.lockutils [req-5eecddfc-28f1-45db-8ad9-521af7a3f0d5 req-3e4a8326-6506-4af6-afb8-4715299f7241 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.662 183407 DEBUG oslo_concurrency.lockutils [req-5eecddfc-28f1-45db-8ad9-521af7a3f0d5 req-3e4a8326-6506-4af6-afb8-4715299f7241 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.662 183407 DEBUG oslo_concurrency.lockutils [req-5eecddfc-28f1-45db-8ad9-521af7a3f0d5 req-3e4a8326-6506-4af6-afb8-4715299f7241 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.663 183407 DEBUG nova.compute.manager [req-5eecddfc-28f1-45db-8ad9-521af7a3f0d5 req-3e4a8326-6506-4af6-afb8-4715299f7241 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] No waiting events found dispatching network-vif-unplugged-27dfabe3-07b8-4c53-a020-e39d337bc11c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.663 183407 DEBUG nova.compute.manager [req-5eecddfc-28f1-45db-8ad9-521af7a3f0d5 req-3e4a8326-6506-4af6-afb8-4715299f7241 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Received event network-vif-unplugged-27dfabe3-07b8-4c53-a020-e39d337bc11c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.663 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.670 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.731 183407 INFO nova.virt.libvirt.driver [-] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Instance destroyed successfully.
Jan 26 15:20:51 compute-1 nova_compute[183403]: 2026-01-26 15:20:51.731 183407 DEBUG nova.objects.instance [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lazy-loading 'resources' on Instance uuid de27c520-d8b0-42de-8d97-f48ffcc94a7e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.037 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.239 183407 DEBUG nova.virt.libvirt.vif [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1440107530',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1440107530',id=15,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:20:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-01twax98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:20:03Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=de27c520-d8b0-42de-8d97-f48ffcc94a7e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.239 183407 DEBUG nova.network.os_vif_util [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converting VIF {"id": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "address": "fa:16:3e:67:fb:7d", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27dfabe3-07", "ovs_interfaceid": "27dfabe3-07b8-4c53-a020-e39d337bc11c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.240 183407 DEBUG nova.network.os_vif_util [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:fb:7d,bridge_name='br-int',has_traffic_filtering=True,id=27dfabe3-07b8-4c53-a020-e39d337bc11c,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27dfabe3-07') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.241 183407 DEBUG os_vif [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:fb:7d,bridge_name='br-int',has_traffic_filtering=True,id=27dfabe3-07b8-4c53-a020-e39d337bc11c,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27dfabe3-07') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.243 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.244 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27dfabe3-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.245 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.247 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.248 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.249 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9b222775-bea0-4654-8701-b9ae07dc6ab3) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.249 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.251 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.256 183407 INFO os_vif [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:fb:7d,bridge_name='br-int',has_traffic_filtering=True,id=27dfabe3-07b8-4c53-a020-e39d337bc11c,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27dfabe3-07')
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.257 183407 INFO nova.virt.libvirt.driver [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Deleting instance files /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e_del
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.259 183407 INFO nova.virt.libvirt.driver [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Deletion of /var/lib/nova/instances/de27c520-d8b0-42de-8d97-f48ffcc94a7e_del complete
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.628 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:52.628 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:20:52 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:52.631 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.779 183407 INFO nova.compute.manager [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Took 1.35 seconds to destroy the instance on the hypervisor.
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.780 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.780 183407 DEBUG nova.compute.manager [-] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.780 183407 DEBUG nova.network.neutron [-] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:20:52 compute-1 nova_compute[183403]: 2026-01-26 15:20:52.780 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.168 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.501 183407 DEBUG nova.compute.manager [req-1859cc1b-79d8-4fbd-81ed-29595cb36f44 req-24afc37e-4708-44d9-86d1-02e7ffa085fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Received event network-vif-deleted-27dfabe3-07b8-4c53-a020-e39d337bc11c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.502 183407 INFO nova.compute.manager [req-1859cc1b-79d8-4fbd-81ed-29595cb36f44 req-24afc37e-4708-44d9-86d1-02e7ffa085fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Neutron deleted interface 27dfabe3-07b8-4c53-a020-e39d337bc11c; detaching it from the instance and deleting it from the info cache
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.502 183407 DEBUG nova.network.neutron [req-1859cc1b-79d8-4fbd-81ed-29595cb36f44 req-24afc37e-4708-44d9-86d1-02e7ffa085fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.725 183407 DEBUG nova.compute.manager [req-0fafe82a-490f-4567-87ef-9ad200faa96f req-3e4f60db-4b75-4c39-b2cd-f1fb8c89e194 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Received event network-vif-unplugged-27dfabe3-07b8-4c53-a020-e39d337bc11c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.725 183407 DEBUG oslo_concurrency.lockutils [req-0fafe82a-490f-4567-87ef-9ad200faa96f req-3e4f60db-4b75-4c39-b2cd-f1fb8c89e194 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.725 183407 DEBUG oslo_concurrency.lockutils [req-0fafe82a-490f-4567-87ef-9ad200faa96f req-3e4f60db-4b75-4c39-b2cd-f1fb8c89e194 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.726 183407 DEBUG oslo_concurrency.lockutils [req-0fafe82a-490f-4567-87ef-9ad200faa96f req-3e4f60db-4b75-4c39-b2cd-f1fb8c89e194 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.726 183407 DEBUG nova.compute.manager [req-0fafe82a-490f-4567-87ef-9ad200faa96f req-3e4f60db-4b75-4c39-b2cd-f1fb8c89e194 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] No waiting events found dispatching network-vif-unplugged-27dfabe3-07b8-4c53-a020-e39d337bc11c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.726 183407 DEBUG nova.compute.manager [req-0fafe82a-490f-4567-87ef-9ad200faa96f req-3e4f60db-4b75-4c39-b2cd-f1fb8c89e194 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Received event network-vif-unplugged-27dfabe3-07b8-4c53-a020-e39d337bc11c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:20:53 compute-1 nova_compute[183403]: 2026-01-26 15:20:53.944 183407 DEBUG nova.network.neutron [-] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:20:54 compute-1 nova_compute[183403]: 2026-01-26 15:20:54.010 183407 DEBUG nova.compute.manager [req-1859cc1b-79d8-4fbd-81ed-29595cb36f44 req-24afc37e-4708-44d9-86d1-02e7ffa085fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Detach interface failed, port_id=27dfabe3-07b8-4c53-a020-e39d337bc11c, reason: Instance de27c520-d8b0-42de-8d97-f48ffcc94a7e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:20:54 compute-1 nova_compute[183403]: 2026-01-26 15:20:54.451 183407 INFO nova.compute.manager [-] [instance: de27c520-d8b0-42de-8d97-f48ffcc94a7e] Took 1.67 seconds to deallocate network for instance.
Jan 26 15:20:54 compute-1 nova_compute[183403]: 2026-01-26 15:20:54.939 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:54 compute-1 nova_compute[183403]: 2026-01-26 15:20:54.979 183407 DEBUG oslo_concurrency.lockutils [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:54 compute-1 nova_compute[183403]: 2026-01-26 15:20:54.980 183407 DEBUG oslo_concurrency.lockutils [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:55 compute-1 nova_compute[183403]: 2026-01-26 15:20:55.046 183407 DEBUG nova.compute.provider_tree [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:20:55 compute-1 nova_compute[183403]: 2026-01-26 15:20:55.556 183407 DEBUG nova.scheduler.client.report [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:20:56 compute-1 nova_compute[183403]: 2026-01-26 15:20:56.068 183407 DEBUG oslo_concurrency.lockutils [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:56 compute-1 nova_compute[183403]: 2026-01-26 15:20:56.097 183407 INFO nova.scheduler.client.report [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Deleted allocations for instance de27c520-d8b0-42de-8d97-f48ffcc94a7e
Jan 26 15:20:57 compute-1 nova_compute[183403]: 2026-01-26 15:20:57.131 183407 DEBUG oslo_concurrency.lockutils [None req-c75d0499-bf35-41f9-8d4f-08f1862b1f86 eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "de27c520-d8b0-42de-8d97-f48ffcc94a7e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.236s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:57 compute-1 nova_compute[183403]: 2026-01-26 15:20:57.251 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:57 compute-1 nova_compute[183403]: 2026-01-26 15:20:57.877 183407 DEBUG oslo_concurrency.lockutils [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:57 compute-1 nova_compute[183403]: 2026-01-26 15:20:57.878 183407 DEBUG oslo_concurrency.lockutils [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:57 compute-1 nova_compute[183403]: 2026-01-26 15:20:57.878 183407 DEBUG oslo_concurrency.lockutils [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:57 compute-1 nova_compute[183403]: 2026-01-26 15:20:57.878 183407 DEBUG oslo_concurrency.lockutils [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:57 compute-1 nova_compute[183403]: 2026-01-26 15:20:57.879 183407 DEBUG oslo_concurrency.lockutils [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:57 compute-1 nova_compute[183403]: 2026-01-26 15:20:57.891 183407 INFO nova.compute.manager [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Terminating instance
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.408 183407 DEBUG nova.compute.manager [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:20:58 compute-1 kernel: tap608de942-8c (unregistering): left promiscuous mode
Jan 26 15:20:58 compute-1 NetworkManager[55716]: <info>  [1769440858.4424] device (tap608de942-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:20:58 compute-1 ovn_controller[95641]: 2026-01-26T15:20:58Z|00139|binding|INFO|Releasing lport 608de942-8c66-4087-931c-0955fbd6c10b from this chassis (sb_readonly=0)
Jan 26 15:20:58 compute-1 ovn_controller[95641]: 2026-01-26T15:20:58Z|00140|binding|INFO|Setting lport 608de942-8c66-4087-931c-0955fbd6c10b down in Southbound
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.451 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:58 compute-1 ovn_controller[95641]: 2026-01-26T15:20:58Z|00141|binding|INFO|Removing iface tap608de942-8c ovn-installed in OVS
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.455 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.460 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:e5:dd 10.100.0.6'], port_security=['fa:16:3e:78:e5:dd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f6352ae1-45e0-4f4e-9df8-f441325b4c16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0df777d6-b389-44bc-b166-8208ab926234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ed11f66f0de4f6191def09f65c67624', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'e54c8a16-0c91-4c9d-aa33-a10f9396fada', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ffea2f62-0986-47bd-a80c-89f1c9decf3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=608de942-8c66-4087-931c-0955fbd6c10b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.461 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 608de942-8c66-4087-931c-0955fbd6c10b in datapath 0df777d6-b389-44bc-b166-8208ab926234 unbound from our chassis
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.462 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0df777d6-b389-44bc-b166-8208ab926234, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.464 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e24e7a48-62af-4c72-872c-580cc385a9ca]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.465 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 namespace which is not needed anymore
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.481 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:58 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 26 15:20:58 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Consumed 2.651s CPU time.
Jan 26 15:20:58 compute-1 systemd-machined[154697]: Machine qemu-12-instance-0000000e terminated.
Jan 26 15:20:58 compute-1 podman[209493]: 2026-01-26 15:20:58.615941179 +0000 UTC m=+0.047156700 container kill f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 15:20:58 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[209144]: [NOTICE]   (209148) : haproxy version is 3.0.5-8e879a5
Jan 26 15:20:58 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[209144]: [NOTICE]   (209148) : path to executable is /usr/sbin/haproxy
Jan 26 15:20:58 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[209144]: [WARNING]  (209148) : Exiting Master process...
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.616 183407 DEBUG nova.compute.manager [req-fc8c902a-5f7c-4721-9b83-7ad0a82b6d56 req-e18bf36c-8f01-49b1-99be-6cca7e0cb3fc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Received event network-vif-unplugged-608de942-8c66-4087-931c-0955fbd6c10b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.617 183407 DEBUG oslo_concurrency.lockutils [req-fc8c902a-5f7c-4721-9b83-7ad0a82b6d56 req-e18bf36c-8f01-49b1-99be-6cca7e0cb3fc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.617 183407 DEBUG oslo_concurrency.lockutils [req-fc8c902a-5f7c-4721-9b83-7ad0a82b6d56 req-e18bf36c-8f01-49b1-99be-6cca7e0cb3fc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:20:58 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[209144]: [ALERT]    (209148) : Current worker (209150) exited with code 143 (Terminated)
Jan 26 15:20:58 compute-1 neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234[209144]: [WARNING]  (209148) : All workers exited. Exiting... (0)
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.618 183407 DEBUG oslo_concurrency.lockutils [req-fc8c902a-5f7c-4721-9b83-7ad0a82b6d56 req-e18bf36c-8f01-49b1-99be-6cca7e0cb3fc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.619 183407 DEBUG nova.compute.manager [req-fc8c902a-5f7c-4721-9b83-7ad0a82b6d56 req-e18bf36c-8f01-49b1-99be-6cca7e0cb3fc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] No waiting events found dispatching network-vif-unplugged-608de942-8c66-4087-931c-0955fbd6c10b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.620 183407 DEBUG nova.compute.manager [req-fc8c902a-5f7c-4721-9b83-7ad0a82b6d56 req-e18bf36c-8f01-49b1-99be-6cca7e0cb3fc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Received event network-vif-unplugged-608de942-8c66-4087-931c-0955fbd6c10b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:20:58 compute-1 systemd[1]: libpod-f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312.scope: Deactivated successfully.
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.632 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.665 183407 INFO nova.virt.libvirt.driver [-] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Instance destroyed successfully.
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.666 183407 DEBUG nova.objects.instance [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lazy-loading 'resources' on Instance uuid f6352ae1-45e0-4f4e-9df8-f441325b4c16 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:20:58 compute-1 podman[209522]: 2026-01-26 15:20:58.677894404 +0000 UTC m=+0.030058184 container died f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 26 15:20:58 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312-userdata-shm.mount: Deactivated successfully.
Jan 26 15:20:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-a9293a14416c9512048c96c12bad82b0e9962331635a5959f83e15160ddd64a6-merged.mount: Deactivated successfully.
Jan 26 15:20:58 compute-1 podman[209522]: 2026-01-26 15:20:58.735437945 +0000 UTC m=+0.087601725 container remove f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.744 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[80df676d-9b3c-4d7b-8ec3-cae49de033f8]: (4, ("Mon Jan 26 03:20:58 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 (f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312)\nf68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312\nMon Jan 26 03:20:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 (f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312)\nf68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.747 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d59f8123-9c66-49cc-a7ef-e9d383c1f75f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.748 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0df777d6-b389-44bc-b166-8208ab926234.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.749 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6138d1-93c2-4612-8f8a-1dd34a26291b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:58 compute-1 systemd[1]: libpod-conmon-f68d499bda40224e8a02ad284a1e95037c16fd53dc91e51202e1e07477e09312.scope: Deactivated successfully.
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.750 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0df777d6-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.753 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:58 compute-1 kernel: tap0df777d6-b0: left promiscuous mode
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.767 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:58 compute-1 nova_compute[183403]: 2026-01-26 15:20:58.771 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.774 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1921e252-97f4-474f-a3db-68ca98fc6afc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.801 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[00e2c8f8-5854-4641-82dc-6035568e145e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.802 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[482a22d9-b930-4354-a2c6-e2882fd29e42]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.821 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6c7085-1cfd-4a66-bb43-da5d8216f6e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452458, 'reachable_time': 20274, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209559, 'error': None, 'target': 'ovnmeta-0df777d6-b389-44bc-b166-8208ab926234', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:58 compute-1 systemd[1]: run-netns-ovnmeta\x2d0df777d6\x2db389\x2d44bc\x2db166\x2d8208ab926234.mount: Deactivated successfully.
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.825 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0df777d6-b389-44bc-b166-8208ab926234 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:20:58 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:20:58.826 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[42fdd406-e614-4a8e-a4b0-35357867ed4b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.171 183407 DEBUG nova.virt.libvirt.vif [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:19:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1014049427',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1014049427',id=14,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:19:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ed11f66f0de4f6191def09f65c67624',ramdisk_id='',reservation_id='r-mvifsctc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1844876463-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:20:41Z,user_data=None,user_id='eabb3af6e41e4d9e883fc43bd03679db',uuid=f6352ae1-45e0-4f4e-9df8-f441325b4c16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "608de942-8c66-4087-931c-0955fbd6c10b", "address": "fa:16:3e:78:e5:dd", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap608de942-8c", "ovs_interfaceid": "608de942-8c66-4087-931c-0955fbd6c10b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.172 183407 DEBUG nova.network.os_vif_util [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converting VIF {"id": "608de942-8c66-4087-931c-0955fbd6c10b", "address": "fa:16:3e:78:e5:dd", "network": {"id": "0df777d6-b389-44bc-b166-8208ab926234", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1259048565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2d0174cdadae4803981796a2cea457d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap608de942-8c", "ovs_interfaceid": "608de942-8c66-4087-931c-0955fbd6c10b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.172 183407 DEBUG nova.network.os_vif_util [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:78:e5:dd,bridge_name='br-int',has_traffic_filtering=True,id=608de942-8c66-4087-931c-0955fbd6c10b,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap608de942-8c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.173 183407 DEBUG os_vif [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:e5:dd,bridge_name='br-int',has_traffic_filtering=True,id=608de942-8c66-4087-931c-0955fbd6c10b,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap608de942-8c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.174 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.174 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap608de942-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.205 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.207 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.208 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.208 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=511ac969-cce1-48aa-b337-369e40dcb0a7) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.209 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.210 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.212 183407 INFO os_vif [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:e5:dd,bridge_name='br-int',has_traffic_filtering=True,id=608de942-8c66-4087-931c-0955fbd6c10b,network=Network(0df777d6-b389-44bc-b166-8208ab926234),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap608de942-8c')
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.212 183407 INFO nova.virt.libvirt.driver [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Deleting instance files /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16_del
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.213 183407 INFO nova.virt.libvirt.driver [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Deletion of /var/lib/nova/instances/f6352ae1-45e0-4f4e-9df8-f441325b4c16_del complete
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.729 183407 INFO nova.compute.manager [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Took 1.32 seconds to destroy the instance on the hypervisor.
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.730 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.730 183407 DEBUG nova.compute.manager [-] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.730 183407 DEBUG nova.network.neutron [-] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.731 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:20:59 compute-1 nova_compute[183403]: 2026-01-26 15:20:59.943 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:00 compute-1 nova_compute[183403]: 2026-01-26 15:21:00.600 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:21:00 compute-1 nova_compute[183403]: 2026-01-26 15:21:00.685 183407 DEBUG nova.compute.manager [req-3ee29ded-bcf9-452c-99ff-451e0a0932ac req-62f37d41-9169-4947-a91b-92325d6787cd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Received event network-vif-unplugged-608de942-8c66-4087-931c-0955fbd6c10b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:21:00 compute-1 nova_compute[183403]: 2026-01-26 15:21:00.686 183407 DEBUG oslo_concurrency.lockutils [req-3ee29ded-bcf9-452c-99ff-451e0a0932ac req-62f37d41-9169-4947-a91b-92325d6787cd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:21:00 compute-1 nova_compute[183403]: 2026-01-26 15:21:00.686 183407 DEBUG oslo_concurrency.lockutils [req-3ee29ded-bcf9-452c-99ff-451e0a0932ac req-62f37d41-9169-4947-a91b-92325d6787cd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:21:00 compute-1 nova_compute[183403]: 2026-01-26 15:21:00.686 183407 DEBUG oslo_concurrency.lockutils [req-3ee29ded-bcf9-452c-99ff-451e0a0932ac req-62f37d41-9169-4947-a91b-92325d6787cd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:21:00 compute-1 nova_compute[183403]: 2026-01-26 15:21:00.686 183407 DEBUG nova.compute.manager [req-3ee29ded-bcf9-452c-99ff-451e0a0932ac req-62f37d41-9169-4947-a91b-92325d6787cd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] No waiting events found dispatching network-vif-unplugged-608de942-8c66-4087-931c-0955fbd6c10b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:21:00 compute-1 nova_compute[183403]: 2026-01-26 15:21:00.687 183407 DEBUG nova.compute.manager [req-3ee29ded-bcf9-452c-99ff-451e0a0932ac req-62f37d41-9169-4947-a91b-92325d6787cd 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Received event network-vif-unplugged-608de942-8c66-4087-931c-0955fbd6c10b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:21:01 compute-1 nova_compute[183403]: 2026-01-26 15:21:01.649 183407 DEBUG nova.compute.manager [req-38b67b39-9509-4baa-a6ff-deedfe028593 req-dc16acef-d0ca-4af0-9e6e-18656be2d0e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Received event network-vif-deleted-608de942-8c66-4087-931c-0955fbd6c10b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:21:01 compute-1 nova_compute[183403]: 2026-01-26 15:21:01.649 183407 INFO nova.compute.manager [req-38b67b39-9509-4baa-a6ff-deedfe028593 req-dc16acef-d0ca-4af0-9e6e-18656be2d0e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Neutron deleted interface 608de942-8c66-4087-931c-0955fbd6c10b; detaching it from the instance and deleting it from the info cache
Jan 26 15:21:01 compute-1 nova_compute[183403]: 2026-01-26 15:21:01.649 183407 DEBUG nova.network.neutron [req-38b67b39-9509-4baa-a6ff-deedfe028593 req-dc16acef-d0ca-4af0-9e6e-18656be2d0e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:21:02 compute-1 nova_compute[183403]: 2026-01-26 15:21:02.094 183407 DEBUG nova.network.neutron [-] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:21:02 compute-1 nova_compute[183403]: 2026-01-26 15:21:02.157 183407 DEBUG nova.compute.manager [req-38b67b39-9509-4baa-a6ff-deedfe028593 req-dc16acef-d0ca-4af0-9e6e-18656be2d0e9 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Detach interface failed, port_id=608de942-8c66-4087-931c-0955fbd6c10b, reason: Instance f6352ae1-45e0-4f4e-9df8-f441325b4c16 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:21:02 compute-1 nova_compute[183403]: 2026-01-26 15:21:02.601 183407 INFO nova.compute.manager [-] [instance: f6352ae1-45e0-4f4e-9df8-f441325b4c16] Took 2.87 seconds to deallocate network for instance.
Jan 26 15:21:03 compute-1 nova_compute[183403]: 2026-01-26 15:21:03.127 183407 DEBUG oslo_concurrency.lockutils [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:21:03 compute-1 nova_compute[183403]: 2026-01-26 15:21:03.128 183407 DEBUG oslo_concurrency.lockutils [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:21:03 compute-1 nova_compute[183403]: 2026-01-26 15:21:03.134 183407 DEBUG oslo_concurrency.lockutils [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:21:03 compute-1 nova_compute[183403]: 2026-01-26 15:21:03.169 183407 INFO nova.scheduler.client.report [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Deleted allocations for instance f6352ae1-45e0-4f4e-9df8-f441325b4c16
Jan 26 15:21:04 compute-1 nova_compute[183403]: 2026-01-26 15:21:04.197 183407 DEBUG oslo_concurrency.lockutils [None req-bdb2dd1c-4556-4430-bb0d-5c5306654cba eabb3af6e41e4d9e883fc43bd03679db 3ed11f66f0de4f6191def09f65c67624 - - default default] Lock "f6352ae1-45e0-4f4e-9df8-f441325b4c16" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.319s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:21:04 compute-1 nova_compute[183403]: 2026-01-26 15:21:04.249 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:04 compute-1 nova_compute[183403]: 2026-01-26 15:21:04.944 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:05 compute-1 podman[192725]: time="2026-01-26T15:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:21:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:21:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Jan 26 15:21:07 compute-1 podman[209560]: 2026-01-26 15:21:07.884071411 +0000 UTC m=+0.059667377 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:21:07 compute-1 podman[209561]: 2026-01-26 15:21:07.923213862 +0000 UTC m=+0.092353289 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 15:21:09 compute-1 nova_compute[183403]: 2026-01-26 15:21:09.412 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:09 compute-1 nova_compute[183403]: 2026-01-26 15:21:09.945 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:10 compute-1 nova_compute[183403]: 2026-01-26 15:21:10.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:10 compute-1 nova_compute[183403]: 2026-01-26 15:21:10.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:10 compute-1 nova_compute[183403]: 2026-01-26 15:21:10.862 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:11 compute-1 nova_compute[183403]: 2026-01-26 15:21:11.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.092 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.286 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.287 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.325 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.326 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5802MB free_disk=73.14486312866211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.326 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:21:12 compute-1 nova_compute[183403]: 2026-01-26 15:21:12.327 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:21:13 compute-1 nova_compute[183403]: 2026-01-26 15:21:13.376 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:21:13 compute-1 nova_compute[183403]: 2026-01-26 15:21:13.376 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:21:12 up  1:16,  0 user,  load average: 0.83, 0.40, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:21:13 compute-1 nova_compute[183403]: 2026-01-26 15:21:13.400 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:21:13 compute-1 nova_compute[183403]: 2026-01-26 15:21:13.906 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:21:14 compute-1 nova_compute[183403]: 2026-01-26 15:21:14.414 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:14 compute-1 nova_compute[183403]: 2026-01-26 15:21:14.420 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:21:14 compute-1 nova_compute[183403]: 2026-01-26 15:21:14.420 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:21:14 compute-1 nova_compute[183403]: 2026-01-26 15:21:14.947 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:15 compute-1 podman[209608]: 2026-01-26 15:21:15.915089946 +0000 UTC m=+0.086004683 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 15:21:15 compute-1 podman[209607]: 2026-01-26 15:21:15.936551566 +0000 UTC m=+0.119304761 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:21:16 compute-1 nova_compute[183403]: 2026-01-26 15:21:16.420 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:16 compute-1 nova_compute[183403]: 2026-01-26 15:21:16.421 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:16 compute-1 nova_compute[183403]: 2026-01-26 15:21:16.422 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:21:16 compute-1 nova_compute[183403]: 2026-01-26 15:21:16.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:17 compute-1 nova_compute[183403]: 2026-01-26 15:21:17.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:18 compute-1 nova_compute[183403]: 2026-01-26 15:21:18.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:18 compute-1 nova_compute[183403]: 2026-01-26 15:21:18.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:18 compute-1 nova_compute[183403]: 2026-01-26 15:21:18.576 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:21:19 compute-1 nova_compute[183403]: 2026-01-26 15:21:19.085 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:21:19 compute-1 nova_compute[183403]: 2026-01-26 15:21:19.416 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:19 compute-1 openstack_network_exporter[195610]: ERROR   15:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:21:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:21:19 compute-1 openstack_network_exporter[195610]: ERROR   15:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:21:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:21:19 compute-1 nova_compute[183403]: 2026-01-26 15:21:19.948 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:22 compute-1 nova_compute[183403]: 2026-01-26 15:21:22.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:22 compute-1 nova_compute[183403]: 2026-01-26 15:21:22.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:21:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:23.341 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:8d:23 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9147aacc62b34487b1727939f1a92703', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d8c1c3b-77a7-4237-87f3-33f734737e5d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=46bf4c6f-a210-4e32-9067-075b592fdafa) old=Port_Binding(mac=['fa:16:3e:63:8d:23'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9147aacc62b34487b1727939f1a92703', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:21:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:23.342 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 46bf4c6f-a210-4e32-9067-075b592fdafa in datapath 3e9dfa8e-4100-40c2-b5c3-611e27e3b601 updated
Jan 26 15:21:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:23.343 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e9dfa8e-4100-40c2-b5c3-611e27e3b601, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:21:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:23.344 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[821d01e2-3f47-4f8e-ad5f-de337e497b91]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:21:24 compute-1 nova_compute[183403]: 2026-01-26 15:21:24.418 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:24 compute-1 nova_compute[183403]: 2026-01-26 15:21:24.950 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:29.057 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:21:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:29.057 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:21:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:29.057 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:21:29 compute-1 nova_compute[183403]: 2026-01-26 15:21:29.421 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:29 compute-1 nova_compute[183403]: 2026-01-26 15:21:29.952 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:34 compute-1 nova_compute[183403]: 2026-01-26 15:21:34.423 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:34 compute-1 nova_compute[183403]: 2026-01-26 15:21:34.955 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:35 compute-1 podman[192725]: time="2026-01-26T15:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:21:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:21:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Jan 26 15:21:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:36.732 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:94:3f 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e0c5ac05-30df-4e8b-a2e8-19c05ebe1a7b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e0c5ac05-30df-4e8b-a2e8-19c05ebe1a7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e07345fa9028494086d0d062e5c6d037', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd5cb898-e1de-4995-a0a2-1ae7a22237dd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7bbf8fe1-99ef-4dcd-b225-9848d16dc24f) old=Port_Binding(mac=['fa:16:3e:ed:94:3f'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e0c5ac05-30df-4e8b-a2e8-19c05ebe1a7b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e0c5ac05-30df-4e8b-a2e8-19c05ebe1a7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e07345fa9028494086d0d062e5c6d037', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:21:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:36.733 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7bbf8fe1-99ef-4dcd-b225-9848d16dc24f in datapath e0c5ac05-30df-4e8b-a2e8-19c05ebe1a7b updated
Jan 26 15:21:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:36.734 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e0c5ac05-30df-4e8b-a2e8-19c05ebe1a7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:21:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:36.734 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed41a0a-8062-4b24-bccf-1d6d5bd2c315]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:21:37 compute-1 nova_compute[183403]: 2026-01-26 15:21:37.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:21:38 compute-1 podman[209655]: 2026-01-26 15:21:38.900728573 +0000 UTC m=+0.074557764 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Jan 26 15:21:38 compute-1 podman[209654]: 2026-01-26 15:21:38.90714196 +0000 UTC m=+0.086072204 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:21:39 compute-1 nova_compute[183403]: 2026-01-26 15:21:39.424 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:39 compute-1 nova_compute[183403]: 2026-01-26 15:21:39.956 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:44 compute-1 nova_compute[183403]: 2026-01-26 15:21:44.426 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:44 compute-1 ovn_controller[95641]: 2026-01-26T15:21:44Z|00142|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 15:21:44 compute-1 nova_compute[183403]: 2026-01-26 15:21:44.958 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:46 compute-1 podman[209699]: 2026-01-26 15:21:46.904098785 +0000 UTC m=+0.064025339 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:21:46 compute-1 podman[209698]: 2026-01-26 15:21:46.944778205 +0000 UTC m=+0.108144848 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 26 15:21:49 compute-1 openstack_network_exporter[195610]: ERROR   15:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:21:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:21:49 compute-1 openstack_network_exporter[195610]: ERROR   15:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:21:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:21:49 compute-1 nova_compute[183403]: 2026-01-26 15:21:49.428 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:49 compute-1 nova_compute[183403]: 2026-01-26 15:21:49.961 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:54 compute-1 nova_compute[183403]: 2026-01-26 15:21:54.431 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:54.698 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:21:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:54.699 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:21:54 compute-1 nova_compute[183403]: 2026-01-26 15:21:54.700 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:54 compute-1 nova_compute[183403]: 2026-01-26 15:21:54.991 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:59 compute-1 nova_compute[183403]: 2026-01-26 15:21:59.440 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:21:59 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:21:59.701 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:21:59 compute-1 nova_compute[183403]: 2026-01-26 15:21:59.994 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:04 compute-1 nova_compute[183403]: 2026-01-26 15:22:04.443 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:04 compute-1 nova_compute[183403]: 2026-01-26 15:22:04.996 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:05 compute-1 podman[192725]: time="2026-01-26T15:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:22:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:22:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2189 "" "Go-http-client/1.1"
Jan 26 15:22:09 compute-1 nova_compute[183403]: 2026-01-26 15:22:09.471 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:09 compute-1 podman[209741]: 2026-01-26 15:22:09.920262781 +0000 UTC m=+0.086244669 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:22:09 compute-1 podman[209742]: 2026-01-26 15:22:09.940744364 +0000 UTC m=+0.103438507 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Jan 26 15:22:09 compute-1 nova_compute[183403]: 2026-01-26 15:22:09.998 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:11 compute-1 nova_compute[183403]: 2026-01-26 15:22:11.089 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:22:11 compute-1 nova_compute[183403]: 2026-01-26 15:22:11.090 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:22:11 compute-1 nova_compute[183403]: 2026-01-26 15:22:11.365 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:11 compute-1 nova_compute[183403]: 2026-01-26 15:22:11.366 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:11 compute-1 nova_compute[183403]: 2026-01-26 15:22:11.874 183407 DEBUG nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:22:12 compute-1 nova_compute[183403]: 2026-01-26 15:22:12.538 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:12 compute-1 nova_compute[183403]: 2026-01-26 15:22:12.538 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:12 compute-1 nova_compute[183403]: 2026-01-26 15:22:12.548 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:22:12 compute-1 nova_compute[183403]: 2026-01-26 15:22:12.549 183407 INFO nova.compute.claims [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:22:12 compute-1 nova_compute[183403]: 2026-01-26 15:22:12.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:22:13 compute-1 nova_compute[183403]: 2026-01-26 15:22:13.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:13 compute-1 nova_compute[183403]: 2026-01-26 15:22:13.630 183407 DEBUG nova.scheduler.client.report [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:22:13 compute-1 nova_compute[183403]: 2026-01-26 15:22:13.646 183407 DEBUG nova.scheduler.client.report [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:22:13 compute-1 nova_compute[183403]: 2026-01-26 15:22:13.647 183407 DEBUG nova.compute.provider_tree [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:22:13 compute-1 nova_compute[183403]: 2026-01-26 15:22:13.659 183407 DEBUG nova.scheduler.client.report [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:22:13 compute-1 nova_compute[183403]: 2026-01-26 15:22:13.681 183407 DEBUG nova.scheduler.client.report [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:22:13 compute-1 nova_compute[183403]: 2026-01-26 15:22:13.725 183407 DEBUG nova.compute.provider_tree [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:22:14 compute-1 nova_compute[183403]: 2026-01-26 15:22:14.234 183407 DEBUG nova.scheduler.client.report [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:22:14 compute-1 nova_compute[183403]: 2026-01-26 15:22:14.512 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:14 compute-1 nova_compute[183403]: 2026-01-26 15:22:14.744 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.205s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:14 compute-1 nova_compute[183403]: 2026-01-26 15:22:14.745 183407 DEBUG nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:22:14 compute-1 nova_compute[183403]: 2026-01-26 15:22:14.750 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.661s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:14 compute-1 nova_compute[183403]: 2026-01-26 15:22:14.750 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:14 compute-1 nova_compute[183403]: 2026-01-26 15:22:14.750 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:22:14 compute-1 nova_compute[183403]: 2026-01-26 15:22:14.979 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:22:14 compute-1 nova_compute[183403]: 2026-01-26 15:22:14.981 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.001 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.003 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.004 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5842MB free_disk=73.14486312866211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.004 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.005 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.261 183407 DEBUG nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.262 183407 DEBUG nova.network.neutron [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.263 183407 WARNING neutronclient.v2_0.client [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.263 183407 WARNING neutronclient.v2_0.client [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:22:15 compute-1 nova_compute[183403]: 2026-01-26 15:22:15.773 183407 INFO nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.038 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 67d3bb43-d956-4beb-8227-316914d585d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.039 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.039 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:22:15 up  1:17,  0 user,  load average: 0.28, 0.32, 0.32\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_e07345fa9028494086d0d062e5c6d037': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.044 183407 DEBUG nova.network.neutron [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Successfully created port: 2a825112-3789-4aab-b884-68a58a42a2fe _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.086 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.282 183407 DEBUG nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.598 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.847 183407 DEBUG nova.network.neutron [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Successfully updated port: 2a825112-3789-4aab-b884-68a58a42a2fe _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.893 183407 DEBUG nova.compute.manager [req-e72aaeca-8b40-4485-a6f9-a953924e0962 req-010db8d2-f2a6-4ffc-819d-efa99db070d5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-changed-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.893 183407 DEBUG nova.compute.manager [req-e72aaeca-8b40-4485-a6f9-a953924e0962 req-010db8d2-f2a6-4ffc-819d-efa99db070d5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Refreshing instance network info cache due to event network-changed-2a825112-3789-4aab-b884-68a58a42a2fe. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.893 183407 DEBUG oslo_concurrency.lockutils [req-e72aaeca-8b40-4485-a6f9-a953924e0962 req-010db8d2-f2a6-4ffc-819d-efa99db070d5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-67d3bb43-d956-4beb-8227-316914d585d8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.893 183407 DEBUG oslo_concurrency.lockutils [req-e72aaeca-8b40-4485-a6f9-a953924e0962 req-010db8d2-f2a6-4ffc-819d-efa99db070d5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-67d3bb43-d956-4beb-8227-316914d585d8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:22:16 compute-1 nova_compute[183403]: 2026-01-26 15:22:16.894 183407 DEBUG nova.network.neutron [req-e72aaeca-8b40-4485-a6f9-a953924e0962 req-010db8d2-f2a6-4ffc-819d-efa99db070d5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Refreshing network info cache for port 2a825112-3789-4aab-b884-68a58a42a2fe _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.111 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.111 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.298 183407 DEBUG nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.301 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.301 183407 INFO nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Creating image(s)
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.302 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.303 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.304 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.305 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.312 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.319 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.352 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "refresh_cache-67d3bb43-d956-4beb-8227-316914d585d8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.404 183407 WARNING neutronclient.v2_0.client [req-e72aaeca-8b40-4485-a6f9-a953924e0962 req-010db8d2-f2a6-4ffc-819d-efa99db070d5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.412 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.413 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.414 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.415 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.422 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.422 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.512 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.514 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.556 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.557 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.557 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.631 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.631 183407 DEBUG nova.virt.disk.api [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Checking if we can resize image /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.632 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.667 183407 DEBUG nova.network.neutron [req-e72aaeca-8b40-4485-a6f9-a953924e0962 req-010db8d2-f2a6-4ffc-819d-efa99db070d5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.691 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.691 183407 DEBUG nova.virt.disk.api [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Cannot resize image /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.692 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.692 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Ensure instance console log exists: /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.692 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.692 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.692 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:17 compute-1 nova_compute[183403]: 2026-01-26 15:22:17.874 183407 DEBUG nova.network.neutron [req-e72aaeca-8b40-4485-a6f9-a953924e0962 req-010db8d2-f2a6-4ffc-819d-efa99db070d5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:22:17 compute-1 podman[209799]: 2026-01-26 15:22:17.922561383 +0000 UTC m=+0.089294977 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 15:22:18 compute-1 podman[209798]: 2026-01-26 15:22:18.000251887 +0000 UTC m=+0.168285505 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 26 15:22:18 compute-1 nova_compute[183403]: 2026-01-26 15:22:18.382 183407 DEBUG oslo_concurrency.lockutils [req-e72aaeca-8b40-4485-a6f9-a953924e0962 req-010db8d2-f2a6-4ffc-819d-efa99db070d5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-67d3bb43-d956-4beb-8227-316914d585d8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:22:18 compute-1 nova_compute[183403]: 2026-01-26 15:22:18.383 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquired lock "refresh_cache-67d3bb43-d956-4beb-8227-316914d585d8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:22:18 compute-1 nova_compute[183403]: 2026-01-26 15:22:18.383 183407 DEBUG nova.network.neutron [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:22:19 compute-1 nova_compute[183403]: 2026-01-26 15:22:19.112 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:22:19 compute-1 nova_compute[183403]: 2026-01-26 15:22:19.113 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:22:19 compute-1 nova_compute[183403]: 2026-01-26 15:22:19.113 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:22:19 compute-1 nova_compute[183403]: 2026-01-26 15:22:19.114 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:22:19 compute-1 nova_compute[183403]: 2026-01-26 15:22:19.114 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:22:19 compute-1 nova_compute[183403]: 2026-01-26 15:22:19.114 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:22:19 compute-1 openstack_network_exporter[195610]: ERROR   15:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:22:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:22:19 compute-1 openstack_network_exporter[195610]: ERROR   15:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:22:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:22:19 compute-1 nova_compute[183403]: 2026-01-26 15:22:19.514 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:19 compute-1 nova_compute[183403]: 2026-01-26 15:22:19.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:22:19 compute-1 nova_compute[183403]: 2026-01-26 15:22:19.926 183407 DEBUG nova.network.neutron [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.004 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.204 183407 WARNING neutronclient.v2_0.client [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.383 183407 DEBUG nova.network.neutron [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Updating instance_info_cache with network_info: [{"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.896 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Releasing lock "refresh_cache-67d3bb43-d956-4beb-8227-316914d585d8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.897 183407 DEBUG nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Instance network_info: |[{"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.901 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Start _get_guest_xml network_info=[{"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.907 183407 WARNING nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.909 183407 DEBUG nova.virt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-29266541', uuid='67d3bb43-d956-4beb-8227-316914d585d8'), owner=OwnerMeta(userid='0c77b3ed882642e3b0c7840dc8efc49a', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin', projectid='e07345fa9028494086d0d062e5c6d037', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769440940.9092898) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.917 183407 DEBUG nova.virt.libvirt.host [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.918 183407 DEBUG nova.virt.libvirt.host [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.921 183407 DEBUG nova.virt.libvirt.host [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.922 183407 DEBUG nova.virt.libvirt.host [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.924 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.924 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.925 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.925 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.926 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.926 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.926 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.927 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.927 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.927 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.928 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.928 183407 DEBUG nova.virt.hardware [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.934 183407 DEBUG nova.virt.libvirt.vif [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:22:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-29266541',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-292',id=17,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e07345fa9028494086d0d062e5c6d037',ramdisk_id='',reservation_id='r-s8t7aae9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:22:16Z,user_data=None,user_id='0c77b3ed882642e3b0c7840dc8efc49a',uuid=67d3bb43-d956-4beb-8227-316914d585d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.935 183407 DEBUG nova.network.os_vif_util [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Converting VIF {"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.936 183407 DEBUG nova.network.os_vif_util [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:67:c8,bridge_name='br-int',has_traffic_filtering=True,id=2a825112-3789-4aab-b884-68a58a42a2fe,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a825112-37') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:22:20 compute-1 nova_compute[183403]: 2026-01-26 15:22:20.937 183407 DEBUG nova.objects.instance [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lazy-loading 'pci_devices' on Instance uuid 67d3bb43-d956-4beb-8227-316914d585d8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.444 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <uuid>67d3bb43-d956-4beb-8227-316914d585d8</uuid>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <name>instance-00000011</name>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-29266541</nova:name>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:22:20</nova:creationTime>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:22:21 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:22:21 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:user uuid="0c77b3ed882642e3b0c7840dc8efc49a">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin</nova:user>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:project uuid="e07345fa9028494086d0d062e5c6d037">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696</nova:project>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         <nova:port uuid="2a825112-3789-4aab-b884-68a58a42a2fe">
Jan 26 15:22:21 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <system>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <entry name="serial">67d3bb43-d956-4beb-8227-316914d585d8</entry>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <entry name="uuid">67d3bb43-d956-4beb-8227-316914d585d8</entry>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     </system>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <os>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   </os>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <features>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   </features>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.config"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:16:67:c8"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <target dev="tap2a825112-37"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/console.log" append="off"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <video>
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     </video>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:22:21 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:22:21 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:22:21 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:22:21 compute-1 nova_compute[183403]: </domain>
Jan 26 15:22:21 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.445 183407 DEBUG nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Preparing to wait for external event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.445 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.445 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.445 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.446 183407 DEBUG nova.virt.libvirt.vif [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:22:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-29266541',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-292',id=17,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e07345fa9028494086d0d062e5c6d037',ramdisk_id='',reservation_id='r-s8t7aae9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:22:16Z,user_data=None,user_id='0c77b3ed882642e3b0c7840dc8efc49a',uuid=67d3bb43-d956-4beb-8227-316914d585d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.446 183407 DEBUG nova.network.os_vif_util [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Converting VIF {"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.447 183407 DEBUG nova.network.os_vif_util [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:67:c8,bridge_name='br-int',has_traffic_filtering=True,id=2a825112-3789-4aab-b884-68a58a42a2fe,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a825112-37') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.447 183407 DEBUG os_vif [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:67:c8,bridge_name='br-int',has_traffic_filtering=True,id=2a825112-3789-4aab-b884-68a58a42a2fe,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a825112-37') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.448 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.448 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.448 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.449 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.449 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '964406bf-360f-5b8a-998d-3a1fdf16fb2b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.485 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.486 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.488 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.488 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a825112-37, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.489 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2a825112-37, col_values=(('qos', UUID('67b2c6f9-7a9a-4130-bf0f-14ba84dde7c4')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.489 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2a825112-37, col_values=(('external_ids', {'iface-id': '2a825112-3789-4aab-b884-68a58a42a2fe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:16:67:c8', 'vm-uuid': '67d3bb43-d956-4beb-8227-316914d585d8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.490 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:21 compute-1 NetworkManager[55716]: <info>  [1769440941.4912] manager: (tap2a825112-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.492 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.497 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:21 compute-1 nova_compute[183403]: 2026-01-26 15:22:21.497 183407 INFO os_vif [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:67:c8,bridge_name='br-int',has_traffic_filtering=True,id=2a825112-3789-4aab-b884-68a58a42a2fe,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a825112-37')
Jan 26 15:22:23 compute-1 nova_compute[183403]: 2026-01-26 15:22:23.036 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:22:23 compute-1 nova_compute[183403]: 2026-01-26 15:22:23.037 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:22:23 compute-1 nova_compute[183403]: 2026-01-26 15:22:23.037 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] No VIF found with MAC fa:16:3e:16:67:c8, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:22:23 compute-1 nova_compute[183403]: 2026-01-26 15:22:23.037 183407 INFO nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Using config drive
Jan 26 15:22:23 compute-1 nova_compute[183403]: 2026-01-26 15:22:23.550 183407 WARNING neutronclient.v2_0.client [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.036 183407 INFO nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Creating config drive at /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.config
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.041 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpprpoiro9 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.165 183407 DEBUG oslo_concurrency.processutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpprpoiro9" returned: 0 in 0.124s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:22:24 compute-1 kernel: tap2a825112-37: entered promiscuous mode
Jan 26 15:22:24 compute-1 NetworkManager[55716]: <info>  [1769440944.2482] manager: (tap2a825112-37): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Jan 26 15:22:24 compute-1 ovn_controller[95641]: 2026-01-26T15:22:24Z|00143|binding|INFO|Claiming lport 2a825112-3789-4aab-b884-68a58a42a2fe for this chassis.
Jan 26 15:22:24 compute-1 ovn_controller[95641]: 2026-01-26T15:22:24Z|00144|binding|INFO|2a825112-3789-4aab-b884-68a58a42a2fe: Claiming fa:16:3e:16:67:c8 10.100.0.3
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.285 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.292 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.302 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:67:c8 10.100.0.3'], port_security=['fa:16:3e:16:67:c8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '67d3bb43-d956-4beb-8227-316914d585d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e07345fa9028494086d0d062e5c6d037', 'neutron:revision_number': '4', 'neutron:security_group_ids': '965d4ca4-c12d-4e4f-bc5a-34d61b5f1699', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d8c1c3b-77a7-4237-87f3-33f734737e5d, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=2a825112-3789-4aab-b884-68a58a42a2fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.303 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 2a825112-3789-4aab-b884-68a58a42a2fe in datapath 3e9dfa8e-4100-40c2-b5c3-611e27e3b601 bound to our chassis
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.304 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3e9dfa8e-4100-40c2-b5c3-611e27e3b601
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.326 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bd604e53-a435-48f4-8adc-766996264cc2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.327 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3e9dfa8e-41 in ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:22:24 compute-1 systemd-machined[154697]: New machine qemu-13-instance-00000011.
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.331 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3e9dfa8e-40 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.331 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[28b97aeb-b73c-4ad9-8f4c-92ba09749820]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.332 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[024d8c23-9ec0-4fe4-8ff3-4f8c5fe47425]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.350 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[a9174e54-003c-4c7c-9116-0446e03d93eb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.364 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:24 compute-1 systemd[1]: Started Virtual Machine qemu-13-instance-00000011.
Jan 26 15:22:24 compute-1 ovn_controller[95641]: 2026-01-26T15:22:24Z|00145|binding|INFO|Setting lport 2a825112-3789-4aab-b884-68a58a42a2fe ovn-installed in OVS
Jan 26 15:22:24 compute-1 ovn_controller[95641]: 2026-01-26T15:22:24Z|00146|binding|INFO|Setting lport 2a825112-3789-4aab-b884-68a58a42a2fe up in Southbound
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.369 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.372 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa6adbb-f510-4217-a083-a92cc15d79a6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 systemd-udevd[209866]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:22:24 compute-1 NetworkManager[55716]: <info>  [1769440944.3931] device (tap2a825112-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:22:24 compute-1 NetworkManager[55716]: <info>  [1769440944.3949] device (tap2a825112-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.416 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[574951af-185a-4be4-bd9c-f3005b2819e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 systemd-udevd[209870]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.423 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[73fe3527-4291-4f77-af93-6baafa2d46cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 NetworkManager[55716]: <info>  [1769440944.4244] manager: (tap3e9dfa8e-40): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.472 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[f61dd863-3923-45a7-a779-830e315de172]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.476 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[38cdfd45-5441-4802-b0fc-4625697ae0db]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 NetworkManager[55716]: <info>  [1769440944.5139] device (tap3e9dfa8e-40): carrier: link connected
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.524 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[fec9cc79-c0b9-4e90-a811-4dccc5e5376b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.525 183407 DEBUG nova.compute.manager [req-b0a2326c-d9dd-4142-9447-f3397984530c req-6a20f842-96d5-4c7e-a846-ea1838e9817e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.525 183407 DEBUG oslo_concurrency.lockutils [req-b0a2326c-d9dd-4142-9447-f3397984530c req-6a20f842-96d5-4c7e-a846-ea1838e9817e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.526 183407 DEBUG oslo_concurrency.lockutils [req-b0a2326c-d9dd-4142-9447-f3397984530c req-6a20f842-96d5-4c7e-a846-ea1838e9817e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.526 183407 DEBUG oslo_concurrency.lockutils [req-b0a2326c-d9dd-4142-9447-f3397984530c req-6a20f842-96d5-4c7e-a846-ea1838e9817e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.526 183407 DEBUG nova.compute.manager [req-b0a2326c-d9dd-4142-9447-f3397984530c req-6a20f842-96d5-4c7e-a846-ea1838e9817e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Processing event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.553 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[95571e0f-369e-472f-a934-312ec789b68b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e9dfa8e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:8d:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466711, 'reachable_time': 41054, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209896, 'error': None, 'target': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.578 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3dde7cce-92cf-4180-8477-bcd0d3934f76]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:8d23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466711, 'tstamp': 466711}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209897, 'error': None, 'target': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.603 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[af8e167a-575f-482a-8454-dbb89c0e0d07]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e9dfa8e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:8d:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466711, 'reachable_time': 41054, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209898, 'error': None, 'target': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.647 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[dafa7c21-5b44-4230-a069-6275a2faa371]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.718 183407 DEBUG nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.723 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.727 183407 INFO nova.virt.libvirt.driver [-] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Instance spawned successfully.
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.727 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.728 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4950e878-7c4c-4e9d-838e-faf26c557b08]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.730 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e9dfa8e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.730 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.731 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e9dfa8e-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:22:24 compute-1 NetworkManager[55716]: <info>  [1769440944.7345] manager: (tap3e9dfa8e-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.734 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:24 compute-1 kernel: tap3e9dfa8e-40: entered promiscuous mode
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.738 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.738 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3e9dfa8e-40, col_values=(('external_ids', {'iface-id': '46bf4c6f-a210-4e32-9067-075b592fdafa'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:22:24 compute-1 ovn_controller[95641]: 2026-01-26T15:22:24Z|00147|binding|INFO|Releasing lport 46bf4c6f-a210-4e32-9067-075b592fdafa from this chassis (sb_readonly=0)
Jan 26 15:22:24 compute-1 nova_compute[183403]: 2026-01-26 15:22:24.763 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.764 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e97abe99-1696-4461-a337-7ab9d96a5d0b]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.766 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.766 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.766 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 3e9dfa8e-4100-40c2-b5c3-611e27e3b601 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.766 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.767 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9a8b6171-3699-45ba-a1a5-743b17f1f19a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.768 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.768 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bb57b827-d375-4420-a6e0-26b8fd96e2bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.769 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-3e9dfa8e-4100-40c2-b5c3-611e27e3b601
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID 3e9dfa8e-4100-40c2-b5c3-611e27e3b601
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:22:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:24.770 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'env', 'PROCESS_TAG=haproxy-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:22:25 compute-1 nova_compute[183403]: 2026-01-26 15:22:25.005 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:25 compute-1 podman[209937]: 2026-01-26 15:22:25.243136892 +0000 UTC m=+0.058528485 container create bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 15:22:25 compute-1 nova_compute[183403]: 2026-01-26 15:22:25.246 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:22:25 compute-1 nova_compute[183403]: 2026-01-26 15:22:25.247 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:22:25 compute-1 nova_compute[183403]: 2026-01-26 15:22:25.247 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:22:25 compute-1 nova_compute[183403]: 2026-01-26 15:22:25.248 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:22:25 compute-1 nova_compute[183403]: 2026-01-26 15:22:25.248 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:22:25 compute-1 nova_compute[183403]: 2026-01-26 15:22:25.249 183407 DEBUG nova.virt.libvirt.driver [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:22:25 compute-1 systemd[1]: Started libpod-conmon-bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4.scope.
Jan 26 15:22:25 compute-1 podman[209937]: 2026-01-26 15:22:25.210054221 +0000 UTC m=+0.025445834 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:22:25 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:22:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3d04bb7779bea1ca078dea6e2d4794f96074ac3bddb557e16c6dc3342c58890/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:22:25 compute-1 podman[209937]: 2026-01-26 15:22:25.556298962 +0000 UTC m=+0.371690595 container init bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:22:25 compute-1 podman[209937]: 2026-01-26 15:22:25.564725462 +0000 UTC m=+0.380117045 container start bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 15:22:25 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[209952]: [NOTICE]   (209956) : New worker (209958) forked
Jan 26 15:22:25 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[209952]: [NOTICE]   (209956) : Loading success.
Jan 26 15:22:25 compute-1 nova_compute[183403]: 2026-01-26 15:22:25.763 183407 INFO nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Took 8.46 seconds to spawn the instance on the hypervisor.
Jan 26 15:22:25 compute-1 nova_compute[183403]: 2026-01-26 15:22:25.764 183407 DEBUG nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:22:26 compute-1 nova_compute[183403]: 2026-01-26 15:22:26.298 183407 INFO nova.compute.manager [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Took 13.91 seconds to build instance.
Jan 26 15:22:26 compute-1 nova_compute[183403]: 2026-01-26 15:22:26.490 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:26 compute-1 nova_compute[183403]: 2026-01-26 15:22:26.581 183407 DEBUG nova.compute.manager [req-8e542086-1670-4d34-8cff-8f54ac5fb073 req-e50679d0-983d-4384-aabc-a39bbffe8dac 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:22:26 compute-1 nova_compute[183403]: 2026-01-26 15:22:26.581 183407 DEBUG oslo_concurrency.lockutils [req-8e542086-1670-4d34-8cff-8f54ac5fb073 req-e50679d0-983d-4384-aabc-a39bbffe8dac 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:26 compute-1 nova_compute[183403]: 2026-01-26 15:22:26.581 183407 DEBUG oslo_concurrency.lockutils [req-8e542086-1670-4d34-8cff-8f54ac5fb073 req-e50679d0-983d-4384-aabc-a39bbffe8dac 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:26 compute-1 nova_compute[183403]: 2026-01-26 15:22:26.581 183407 DEBUG oslo_concurrency.lockutils [req-8e542086-1670-4d34-8cff-8f54ac5fb073 req-e50679d0-983d-4384-aabc-a39bbffe8dac 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:26 compute-1 nova_compute[183403]: 2026-01-26 15:22:26.582 183407 DEBUG nova.compute.manager [req-8e542086-1670-4d34-8cff-8f54ac5fb073 req-e50679d0-983d-4384-aabc-a39bbffe8dac 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] No waiting events found dispatching network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:22:26 compute-1 nova_compute[183403]: 2026-01-26 15:22:26.582 183407 WARNING nova.compute.manager [req-8e542086-1670-4d34-8cff-8f54ac5fb073 req-e50679d0-983d-4384-aabc-a39bbffe8dac 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received unexpected event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe for instance with vm_state active and task_state None.
Jan 26 15:22:26 compute-1 nova_compute[183403]: 2026-01-26 15:22:26.803 183407 DEBUG oslo_concurrency.lockutils [None req-3ab7fc5b-8a99-4e47-88d5-540efbdb73c6 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.437s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:29.058 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:29.059 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:22:29.059 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:30 compute-1 nova_compute[183403]: 2026-01-26 15:22:30.006 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:31 compute-1 nova_compute[183403]: 2026-01-26 15:22:31.492 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:34 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 15:22:35 compute-1 nova_compute[183403]: 2026-01-26 15:22:35.008 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:35 compute-1 nova_compute[183403]: 2026-01-26 15:22:35.453 183407 DEBUG nova.compute.manager [None req-48a9b547-b57f-4182-a3ef-8cad54102811 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Jan 26 15:22:35 compute-1 nova_compute[183403]: 2026-01-26 15:22:35.530 183407 DEBUG nova.compute.provider_tree [None req-48a9b547-b57f-4182-a3ef-8cad54102811 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 20 to 21 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 15:22:35 compute-1 podman[192725]: time="2026-01-26T15:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:22:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:22:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2658 "" "Go-http-client/1.1"
Jan 26 15:22:36 compute-1 nova_compute[183403]: 2026-01-26 15:22:36.495 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:38 compute-1 ovn_controller[95641]: 2026-01-26T15:22:38Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:16:67:c8 10.100.0.3
Jan 26 15:22:38 compute-1 ovn_controller[95641]: 2026-01-26T15:22:38Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:16:67:c8 10.100.0.3
Jan 26 15:22:40 compute-1 nova_compute[183403]: 2026-01-26 15:22:40.009 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:40 compute-1 podman[209985]: 2026-01-26 15:22:40.935082161 +0000 UTC m=+0.092722337 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Jan 26 15:22:40 compute-1 podman[209984]: 2026-01-26 15:22:40.95808397 +0000 UTC m=+0.114158515 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:22:41 compute-1 nova_compute[183403]: 2026-01-26 15:22:41.499 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:42 compute-1 nova_compute[183403]: 2026-01-26 15:22:42.721 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Check if temp file /var/lib/nova/instances/tmpxc9ohlpa exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 15:22:42 compute-1 nova_compute[183403]: 2026-01-26 15:22:42.727 183407 DEBUG nova.compute.manager [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxc9ohlpa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='67d3bb43-d956-4beb-8227-316914d585d8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 15:22:45 compute-1 nova_compute[183403]: 2026-01-26 15:22:45.013 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:46 compute-1 nova_compute[183403]: 2026-01-26 15:22:46.502 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:46 compute-1 nova_compute[183403]: 2026-01-26 15:22:46.920 183407 DEBUG oslo_concurrency.processutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:22:47 compute-1 nova_compute[183403]: 2026-01-26 15:22:47.009 183407 DEBUG oslo_concurrency.processutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:22:47 compute-1 nova_compute[183403]: 2026-01-26 15:22:47.010 183407 DEBUG oslo_concurrency.processutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:22:47 compute-1 nova_compute[183403]: 2026-01-26 15:22:47.108 183407 DEBUG oslo_concurrency.processutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:22:47 compute-1 nova_compute[183403]: 2026-01-26 15:22:47.111 183407 DEBUG nova.compute.manager [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Preparing to wait for external event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:22:47 compute-1 nova_compute[183403]: 2026-01-26 15:22:47.111 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:47 compute-1 nova_compute[183403]: 2026-01-26 15:22:47.112 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:47 compute-1 nova_compute[183403]: 2026-01-26 15:22:47.113 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:48 compute-1 podman[210036]: 2026-01-26 15:22:48.928529047 +0000 UTC m=+0.095863737 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 15:22:48 compute-1 podman[210035]: 2026-01-26 15:22:48.966162186 +0000 UTC m=+0.131173993 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 15:22:49 compute-1 openstack_network_exporter[195610]: ERROR   15:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:22:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:22:49 compute-1 openstack_network_exporter[195610]: ERROR   15:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:22:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:22:50 compute-1 nova_compute[183403]: 2026-01-26 15:22:50.015 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:51 compute-1 nova_compute[183403]: 2026-01-26 15:22:51.535 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:53 compute-1 nova_compute[183403]: 2026-01-26 15:22:53.488 183407 DEBUG nova.compute.manager [req-5554746a-4556-4f1f-89b9-891c4b074b6e req-c3e6b05a-9b90-49eb-93dd-342c1c9b8247 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:22:53 compute-1 nova_compute[183403]: 2026-01-26 15:22:53.488 183407 DEBUG oslo_concurrency.lockutils [req-5554746a-4556-4f1f-89b9-891c4b074b6e req-c3e6b05a-9b90-49eb-93dd-342c1c9b8247 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:53 compute-1 nova_compute[183403]: 2026-01-26 15:22:53.489 183407 DEBUG oslo_concurrency.lockutils [req-5554746a-4556-4f1f-89b9-891c4b074b6e req-c3e6b05a-9b90-49eb-93dd-342c1c9b8247 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:53 compute-1 nova_compute[183403]: 2026-01-26 15:22:53.489 183407 DEBUG oslo_concurrency.lockutils [req-5554746a-4556-4f1f-89b9-891c4b074b6e req-c3e6b05a-9b90-49eb-93dd-342c1c9b8247 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:53 compute-1 nova_compute[183403]: 2026-01-26 15:22:53.490 183407 DEBUG nova.compute.manager [req-5554746a-4556-4f1f-89b9-891c4b074b6e req-c3e6b05a-9b90-49eb-93dd-342c1c9b8247 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] No event matching network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe in dict_keys([('network-vif-plugged', '2a825112-3789-4aab-b884-68a58a42a2fe')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 15:22:53 compute-1 nova_compute[183403]: 2026-01-26 15:22:53.490 183407 DEBUG nova.compute.manager [req-5554746a-4556-4f1f-89b9-891c4b074b6e req-c3e6b05a-9b90-49eb-93dd-342c1c9b8247 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:22:54 compute-1 nova_compute[183403]: 2026-01-26 15:22:54.641 183407 INFO nova.compute.manager [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Took 7.53 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Jan 26 15:22:54 compute-1 ovn_controller[95641]: 2026-01-26T15:22:54Z|00148|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.016 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.544 183407 DEBUG nova.compute.manager [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.544 183407 DEBUG oslo_concurrency.lockutils [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.545 183407 DEBUG oslo_concurrency.lockutils [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.545 183407 DEBUG oslo_concurrency.lockutils [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.546 183407 DEBUG nova.compute.manager [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Processing event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.546 183407 DEBUG nova.compute.manager [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-changed-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.546 183407 DEBUG nova.compute.manager [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Refreshing instance network info cache due to event network-changed-2a825112-3789-4aab-b884-68a58a42a2fe. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.546 183407 DEBUG oslo_concurrency.lockutils [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-67d3bb43-d956-4beb-8227-316914d585d8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.546 183407 DEBUG oslo_concurrency.lockutils [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-67d3bb43-d956-4beb-8227-316914d585d8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.547 183407 DEBUG nova.network.neutron [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Refreshing network info cache for port 2a825112-3789-4aab-b884-68a58a42a2fe _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:22:55 compute-1 nova_compute[183403]: 2026-01-26 15:22:55.548 183407 DEBUG nova.compute.manager [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:22:56 compute-1 nova_compute[183403]: 2026-01-26 15:22:56.055 183407 DEBUG nova.compute.manager [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxc9ohlpa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='67d3bb43-d956-4beb-8227-316914d585d8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(c6e4073f-8f9c-4399-9bad-86e872650d5e),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 15:22:56 compute-1 nova_compute[183403]: 2026-01-26 15:22:56.056 183407 WARNING neutronclient.v2_0.client [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:22:56 compute-1 nova_compute[183403]: 2026-01-26 15:22:56.567 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:22:56 compute-1 nova_compute[183403]: 2026-01-26 15:22:56.572 183407 DEBUG nova.objects.instance [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 67d3bb43-d956-4beb-8227-316914d585d8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:22:56 compute-1 nova_compute[183403]: 2026-01-26 15:22:56.573 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 15:22:56 compute-1 nova_compute[183403]: 2026-01-26 15:22:56.575 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 15:22:56 compute-1 nova_compute[183403]: 2026-01-26 15:22:56.575 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 15:22:56 compute-1 nova_compute[183403]: 2026-01-26 15:22:56.959 183407 WARNING neutronclient.v2_0.client [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.078 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.078 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.086 183407 DEBUG nova.virt.libvirt.vif [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:22:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-29266541',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-292',id=17,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:22:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e07345fa9028494086d0d062e5c6d037',ramdisk_id='',reservation_id='r-s8t7aae9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:22:25Z,user_data=None,user_id='0c77b3ed882642e3b0c7840dc8efc49a',uuid=67d3bb43-d956-4beb-8227-316914d585d8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.087 183407 DEBUG nova.network.os_vif_util [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.088 183407 DEBUG nova.network.os_vif_util [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:67:c8,bridge_name='br-int',has_traffic_filtering=True,id=2a825112-3789-4aab-b884-68a58a42a2fe,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a825112-37') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.089 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <mac address="fa:16:3e:16:67:c8"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <model type="virtio"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <mtu size="1442"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <target dev="tap2a825112-37"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]: </interface>
Jan 26 15:22:57 compute-1 nova_compute[183403]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.090 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <name>instance-00000011</name>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <uuid>67d3bb43-d956-4beb-8227-316914d585d8</uuid>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-29266541</nova:name>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:22:20</nova:creationTime>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:22:57 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:22:57 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:user uuid="0c77b3ed882642e3b0c7840dc8efc49a">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin</nova:user>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:project uuid="e07345fa9028494086d0d062e5c6d037">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696</nova:project>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:port uuid="2a825112-3789-4aab-b884-68a58a42a2fe">
Jan 26 15:22:57 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <system>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="serial">67d3bb43-d956-4beb-8227-316914d585d8</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="uuid">67d3bb43-d956-4beb-8227-316914d585d8</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </system>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <os>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </os>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <features>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </features>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.config"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <interface type="ethernet"><mac address="fa:16:3e:16:67:c8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a825112-37"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </interface><serial type="pty">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/console.log" append="off"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </target>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/console.log" append="off"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </console>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </input>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <video>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </video>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]: </domain>
Jan 26 15:22:57 compute-1 nova_compute[183403]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.092 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <name>instance-00000011</name>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <uuid>67d3bb43-d956-4beb-8227-316914d585d8</uuid>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-29266541</nova:name>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:22:20</nova:creationTime>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:22:57 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:22:57 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:user uuid="0c77b3ed882642e3b0c7840dc8efc49a">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin</nova:user>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:project uuid="e07345fa9028494086d0d062e5c6d037">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696</nova:project>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:port uuid="2a825112-3789-4aab-b884-68a58a42a2fe">
Jan 26 15:22:57 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <system>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="serial">67d3bb43-d956-4beb-8227-316914d585d8</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="uuid">67d3bb43-d956-4beb-8227-316914d585d8</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </system>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <os>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </os>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <features>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </features>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.config"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:16:67:c8"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target dev="tap2a825112-37"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/console.log" append="off"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </target>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/console.log" append="off"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </console>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </input>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <video>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </video>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]: </domain>
Jan 26 15:22:57 compute-1 nova_compute[183403]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.094 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <name>instance-00000011</name>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <uuid>67d3bb43-d956-4beb-8227-316914d585d8</uuid>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-29266541</nova:name>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:22:20</nova:creationTime>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:22:57 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:22:57 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:user uuid="0c77b3ed882642e3b0c7840dc8efc49a">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin</nova:user>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:project uuid="e07345fa9028494086d0d062e5c6d037">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696</nova:project>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <nova:port uuid="2a825112-3789-4aab-b884-68a58a42a2fe">
Jan 26 15:22:57 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <system>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="serial">67d3bb43-d956-4beb-8227-316914d585d8</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="uuid">67d3bb43-d956-4beb-8227-316914d585d8</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </system>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <os>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </os>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <features>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </features>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/disk.config"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:16:67:c8"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target dev="tap2a825112-37"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/console.log" append="off"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:22:57 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       </target>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8/console.log" append="off"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </console>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </input>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <video>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </video>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:22:57 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:22:57 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:22:57 compute-1 nova_compute[183403]: </domain>
Jan 26 15:22:57 compute-1 nova_compute[183403]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.095 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.097 183407 DEBUG nova.network.neutron [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Updated VIF entry in instance network info cache for port 2a825112-3789-4aab-b884-68a58a42a2fe. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.097 183407 DEBUG nova.network.neutron [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Updating instance_info_cache with network_info: [{"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.581 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.582 183407 INFO nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 15:22:57 compute-1 nova_compute[183403]: 2026-01-26 15:22:57.607 183407 DEBUG oslo_concurrency.lockutils [req-26f5f7d5-9e32-4601-ac96-66521932ed53 req-91907ffb-6ed6-46fd-b453-14196a6ed995 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-67d3bb43-d956-4beb-8227-316914d585d8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:22:58 compute-1 nova_compute[183403]: 2026-01-26 15:22:58.603 183407 INFO nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 15:22:59 compute-1 nova_compute[183403]: 2026-01-26 15:22:59.107 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 15:22:59 compute-1 nova_compute[183403]: 2026-01-26 15:22:59.107 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 15:22:59 compute-1 nova_compute[183403]: 2026-01-26 15:22:59.611 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 15:22:59 compute-1 nova_compute[183403]: 2026-01-26 15:22:59.612 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 15:23:00 compute-1 nova_compute[183403]: 2026-01-26 15:23:00.020 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:00 compute-1 nova_compute[183403]: 2026-01-26 15:23:00.673 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 15:23:00 compute-1 nova_compute[183403]: 2026-01-26 15:23:00.674 183407 DEBUG nova.virt.libvirt.migration [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 15:23:00 compute-1 kernel: tap2a825112-37 (unregistering): left promiscuous mode
Jan 26 15:23:00 compute-1 NetworkManager[55716]: <info>  [1769440980.8656] device (tap2a825112-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:23:00 compute-1 ovn_controller[95641]: 2026-01-26T15:23:00Z|00149|binding|INFO|Releasing lport 2a825112-3789-4aab-b884-68a58a42a2fe from this chassis (sb_readonly=0)
Jan 26 15:23:00 compute-1 ovn_controller[95641]: 2026-01-26T15:23:00Z|00150|binding|INFO|Setting lport 2a825112-3789-4aab-b884-68a58a42a2fe down in Southbound
Jan 26 15:23:00 compute-1 ovn_controller[95641]: 2026-01-26T15:23:00Z|00151|binding|INFO|Removing iface tap2a825112-37 ovn-installed in OVS
Jan 26 15:23:00 compute-1 nova_compute[183403]: 2026-01-26 15:23:00.878 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:00 compute-1 nova_compute[183403]: 2026-01-26 15:23:00.880 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:00.888 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:67:c8 10.100.0.3'], port_security=['fa:16:3e:16:67:c8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3e0272b2-d627-4653-a221-12286e3af322'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '67d3bb43-d956-4beb-8227-316914d585d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e07345fa9028494086d0d062e5c6d037', 'neutron:revision_number': '10', 'neutron:security_group_ids': '965d4ca4-c12d-4e4f-bc5a-34d61b5f1699', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d8c1c3b-77a7-4237-87f3-33f734737e5d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=2a825112-3789-4aab-b884-68a58a42a2fe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:23:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:00.890 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 2a825112-3789-4aab-b884-68a58a42a2fe in datapath 3e9dfa8e-4100-40c2-b5c3-611e27e3b601 unbound from our chassis
Jan 26 15:23:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:00.892 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e9dfa8e-4100-40c2-b5c3-611e27e3b601, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:23:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:00.899 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6fbe77c8-04d5-40b9-9007-140e6a5a652f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:23:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:00.901 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 namespace which is not needed anymore
Jan 26 15:23:00 compute-1 nova_compute[183403]: 2026-01-26 15:23:00.913 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:00 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 26 15:23:00 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Consumed 14.248s CPU time.
Jan 26 15:23:00 compute-1 systemd-machined[154697]: Machine qemu-13-instance-00000011 terminated.
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.072 183407 DEBUG nova.compute.manager [req-5c5cec1d-06a4-4f01-afd5-eaf397b2362d req-a95e9d25-a608-4d02-9ea6-6d6ac8628d96 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.073 183407 DEBUG oslo_concurrency.lockutils [req-5c5cec1d-06a4-4f01-afd5-eaf397b2362d req-a95e9d25-a608-4d02-9ea6-6d6ac8628d96 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.074 183407 DEBUG oslo_concurrency.lockutils [req-5c5cec1d-06a4-4f01-afd5-eaf397b2362d req-a95e9d25-a608-4d02-9ea6-6d6ac8628d96 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.075 183407 DEBUG oslo_concurrency.lockutils [req-5c5cec1d-06a4-4f01-afd5-eaf397b2362d req-a95e9d25-a608-4d02-9ea6-6d6ac8628d96 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.075 183407 DEBUG nova.compute.manager [req-5c5cec1d-06a4-4f01-afd5-eaf397b2362d req-a95e9d25-a608-4d02-9ea6-6d6ac8628d96 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] No waiting events found dispatching network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.076 183407 DEBUG nova.compute.manager [req-5c5cec1d-06a4-4f01-afd5-eaf397b2362d req-a95e9d25-a608-4d02-9ea6-6d6ac8628d96 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.077 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.079 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 podman[210112]: 2026-01-26 15:23:01.131578958 +0000 UTC m=+0.065818783 container kill bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 15:23:01 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[209952]: [NOTICE]   (209956) : haproxy version is 3.0.5-8e879a5
Jan 26 15:23:01 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[209952]: [NOTICE]   (209956) : path to executable is /usr/sbin/haproxy
Jan 26 15:23:01 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[209952]: [WARNING]  (209956) : Exiting Master process...
Jan 26 15:23:01 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[209952]: [ALERT]    (209956) : Current worker (209958) exited with code 143 (Terminated)
Jan 26 15:23:01 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[209952]: [WARNING]  (209956) : All workers exited. Exiting... (0)
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.137 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.138 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.138 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 15:23:01 compute-1 systemd[1]: libpod-bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4.scope: Deactivated successfully.
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.177 183407 DEBUG nova.virt.libvirt.guest [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '67d3bb43-d956-4beb-8227-316914d585d8' (instance-00000011) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.178 183407 INFO nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Migration operation has completed
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.178 183407 INFO nova.compute.manager [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] _post_live_migration() is started..
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.195 183407 WARNING neutronclient.v2_0.client [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.196 183407 WARNING neutronclient.v2_0.client [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:23:01 compute-1 podman[210144]: 2026-01-26 15:23:01.202824217 +0000 UTC m=+0.032373367 container died bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120)
Jan 26 15:23:01 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4-userdata-shm.mount: Deactivated successfully.
Jan 26 15:23:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-c3d04bb7779bea1ca078dea6e2d4794f96074ac3bddb557e16c6dc3342c58890-merged.mount: Deactivated successfully.
Jan 26 15:23:01 compute-1 podman[210144]: 2026-01-26 15:23:01.261084595 +0000 UTC m=+0.090633705 container remove bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120)
Jan 26 15:23:01 compute-1 systemd[1]: libpod-conmon-bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4.scope: Deactivated successfully.
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.273 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a3dbefc1-3f4a-4b03-bd7f-9c50d9d36690]: (4, ("Mon Jan 26 03:23:01 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 (bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4)\nbce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4\nMon Jan 26 03:23:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 (bce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4)\nbce9f94fe13cc04861f9cb6271bef15812c5f5c5c6f2108df399d7a120f65aa4\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.276 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1137cf4c-f74d-4f2d-88e7-6a1a94163876]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.276 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.277 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[875e1d5f-c3fd-4fab-89a6-6886513266e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.278 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e9dfa8e-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.281 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 kernel: tap3e9dfa8e-40: left promiscuous mode
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.310 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.317 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd9c839-5544-454a-a6c1-2970761bbff4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.341 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[32e1ff72-2e53-4a53-8106-4798b27de762]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.344 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0f653724-5f49-4234-bc3f-ffa1682d87cf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.365 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f93a0e87-44c5-4cf6-86c1-c7a3b750ee11]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466701, 'reachable_time': 25440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210171, 'error': None, 'target': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:23:01 compute-1 systemd[1]: run-netns-ovnmeta\x2d3e9dfa8e\x2d4100\x2d40c2\x2db5c3\x2d611e27e3b601.mount: Deactivated successfully.
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.373 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.373 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[3e70c35f-1fdc-4e0b-b9c0-d1cc90301cf7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.496 183407 DEBUG nova.compute.manager [req-0da5175d-102d-47d8-8de0-a125ee691a29 req-b5d23e4f-7a01-4f09-b9a3-301403658006 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.497 183407 DEBUG oslo_concurrency.lockutils [req-0da5175d-102d-47d8-8de0-a125ee691a29 req-b5d23e4f-7a01-4f09-b9a3-301403658006 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.498 183407 DEBUG oslo_concurrency.lockutils [req-0da5175d-102d-47d8-8de0-a125ee691a29 req-b5d23e4f-7a01-4f09-b9a3-301403658006 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.499 183407 DEBUG oslo_concurrency.lockutils [req-0da5175d-102d-47d8-8de0-a125ee691a29 req-b5d23e4f-7a01-4f09-b9a3-301403658006 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.499 183407 DEBUG nova.compute.manager [req-0da5175d-102d-47d8-8de0-a125ee691a29 req-b5d23e4f-7a01-4f09-b9a3-301403658006 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] No waiting events found dispatching network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.500 183407 DEBUG nova.compute.manager [req-0da5175d-102d-47d8-8de0-a125ee691a29 req-b5d23e4f-7a01-4f09-b9a3-301403658006 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.535 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.536 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:01.537 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.569 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.725 183407 DEBUG nova.network.neutron [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Activated binding for port 2a825112-3789-4aab-b884-68a58a42a2fe and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.727 183407 DEBUG nova.compute.manager [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.728 183407 DEBUG nova.virt.libvirt.vif [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:22:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-29266541',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-292',id=17,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:22:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e07345fa9028494086d0d062e5c6d037',ramdisk_id='',reservation_id='r-s8t7aae9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:22:38Z,user_data=None,user_id='0c77b3ed882642e3b0c7840dc8efc49a',uuid=67d3bb43-d956-4beb-8227-316914d585d8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.728 183407 DEBUG nova.network.os_vif_util [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "2a825112-3789-4aab-b884-68a58a42a2fe", "address": "fa:16:3e:16:67:c8", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a825112-37", "ovs_interfaceid": "2a825112-3789-4aab-b884-68a58a42a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.729 183407 DEBUG nova.network.os_vif_util [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:67:c8,bridge_name='br-int',has_traffic_filtering=True,id=2a825112-3789-4aab-b884-68a58a42a2fe,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a825112-37') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.730 183407 DEBUG os_vif [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:67:c8,bridge_name='br-int',has_traffic_filtering=True,id=2a825112-3789-4aab-b884-68a58a42a2fe,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a825112-37') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.733 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.734 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a825112-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.736 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.738 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.740 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.740 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=67b2c6f9-7a9a-4130-bf0f-14ba84dde7c4) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.742 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.743 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.756 183407 INFO os_vif [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:67:c8,bridge_name='br-int',has_traffic_filtering=True,id=2a825112-3789-4aab-b884-68a58a42a2fe,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a825112-37')
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.757 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.757 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.758 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.759 183407 DEBUG nova.compute.manager [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.760 183407 INFO nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Deleting instance files /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8_del
Jan 26 15:23:01 compute-1 nova_compute[183403]: 2026-01-26 15:23:01.761 183407 INFO nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Deletion of /var/lib/nova/instances/67d3bb43-d956-4beb-8227-316914d585d8_del complete
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.139 183407 DEBUG nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.140 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.140 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.141 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.141 183407 DEBUG nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] No waiting events found dispatching network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.141 183407 WARNING nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received unexpected event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe for instance with vm_state active and task_state migrating.
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.142 183407 DEBUG nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.142 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.142 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.142 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.143 183407 DEBUG nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] No waiting events found dispatching network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.143 183407 DEBUG nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-unplugged-2a825112-3789-4aab-b884-68a58a42a2fe for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.143 183407 DEBUG nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.144 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.144 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.144 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.145 183407 DEBUG nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] No waiting events found dispatching network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.145 183407 WARNING nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received unexpected event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe for instance with vm_state active and task_state migrating.
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.145 183407 DEBUG nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.145 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.146 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.146 183407 DEBUG oslo_concurrency.lockutils [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.146 183407 DEBUG nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] No waiting events found dispatching network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:23:03 compute-1 nova_compute[183403]: 2026-01-26 15:23:03.146 183407 WARNING nova.compute.manager [req-a56a0170-caba-434c-a7ed-2c014d96590e req-a38a6219-048e-4e44-bbbb-bad02eb04a42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Received unexpected event network-vif-plugged-2a825112-3789-4aab-b884-68a58a42a2fe for instance with vm_state active and task_state migrating.
Jan 26 15:23:05 compute-1 nova_compute[183403]: 2026-01-26 15:23:05.023 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:05 compute-1 podman[192725]: time="2026-01-26T15:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:23:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:23:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:23:06 compute-1 nova_compute[183403]: 2026-01-26 15:23:06.742 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:07 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:07.540 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:23:10 compute-1 nova_compute[183403]: 2026-01-26 15:23:10.026 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:11 compute-1 nova_compute[183403]: 2026-01-26 15:23:11.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:23:11 compute-1 nova_compute[183403]: 2026-01-26 15:23:11.799 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:11 compute-1 podman[210174]: 2026-01-26 15:23:11.930901247 +0000 UTC m=+0.089021701 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Jan 26 15:23:11 compute-1 podman[210173]: 2026-01-26 15:23:11.9376698 +0000 UTC m=+0.103519934 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:23:12 compute-1 nova_compute[183403]: 2026-01-26 15:23:12.310 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "67d3bb43-d956-4beb-8227-316914d585d8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:12 compute-1 nova_compute[183403]: 2026-01-26 15:23:12.311 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:12 compute-1 nova_compute[183403]: 2026-01-26 15:23:12.311 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "67d3bb43-d956-4beb-8227-316914d585d8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:12 compute-1 nova_compute[183403]: 2026-01-26 15:23:12.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:23:12 compute-1 nova_compute[183403]: 2026-01-26 15:23:12.831 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:12 compute-1 nova_compute[183403]: 2026-01-26 15:23:12.832 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:12 compute-1 nova_compute[183403]: 2026-01-26 15:23:12.832 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:12 compute-1 nova_compute[183403]: 2026-01-26 15:23:12.832 183407 DEBUG nova.compute.resource_tracker [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:23:13 compute-1 nova_compute[183403]: 2026-01-26 15:23:13.088 183407 WARNING nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:23:13 compute-1 nova_compute[183403]: 2026-01-26 15:23:13.091 183407 DEBUG oslo_concurrency.processutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:23:13 compute-1 nova_compute[183403]: 2026-01-26 15:23:13.118 183407 DEBUG oslo_concurrency.processutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:23:13 compute-1 nova_compute[183403]: 2026-01-26 15:23:13.119 183407 DEBUG nova.compute.resource_tracker [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5809MB free_disk=73.14473342895508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": 
"0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:23:13 compute-1 nova_compute[183403]: 2026-01-26 15:23:13.120 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:13 compute-1 nova_compute[183403]: 2026-01-26 15:23:13.120 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:14 compute-1 nova_compute[183403]: 2026-01-26 15:23:14.140 183407 DEBUG nova.compute.resource_tracker [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Migration for instance 67d3bb43-d956-4beb-8227-316914d585d8 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 15:23:14 compute-1 nova_compute[183403]: 2026-01-26 15:23:14.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:23:14 compute-1 nova_compute[183403]: 2026-01-26 15:23:14.652 183407 DEBUG nova.compute.resource_tracker [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 15:23:14 compute-1 nova_compute[183403]: 2026-01-26 15:23:14.724 183407 DEBUG nova.compute.resource_tracker [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Migration c6e4073f-8f9c-4399-9bad-86e872650d5e is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 15:23:14 compute-1 nova_compute[183403]: 2026-01-26 15:23:14.724 183407 DEBUG nova.compute.resource_tracker [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:23:14 compute-1 nova_compute[183403]: 2026-01-26 15:23:14.725 183407 DEBUG nova.compute.resource_tracker [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:23:13 up  1:18,  0 user,  load average: 0.29, 0.33, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:23:14 compute-1 nova_compute[183403]: 2026-01-26 15:23:14.771 183407 DEBUG nova.compute.provider_tree [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:23:15 compute-1 nova_compute[183403]: 2026-01-26 15:23:15.029 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:15 compute-1 nova_compute[183403]: 2026-01-26 15:23:15.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:15 compute-1 nova_compute[183403]: 2026-01-26 15:23:15.281 183407 DEBUG nova.scheduler.client.report [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:23:15 compute-1 nova_compute[183403]: 2026-01-26 15:23:15.792 183407 DEBUG nova.compute.resource_tracker [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:23:15 compute-1 nova_compute[183403]: 2026-01-26 15:23:15.793 183407 DEBUG oslo_concurrency.lockutils [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.673s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:15 compute-1 nova_compute[183403]: 2026-01-26 15:23:15.799 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.710s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:15 compute-1 nova_compute[183403]: 2026-01-26 15:23:15.799 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:15 compute-1 nova_compute[183403]: 2026-01-26 15:23:15.800 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:23:15 compute-1 nova_compute[183403]: 2026-01-26 15:23:15.816 183407 INFO nova.compute.manager [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Jan 26 15:23:16 compute-1 nova_compute[183403]: 2026-01-26 15:23:16.040 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:23:16 compute-1 nova_compute[183403]: 2026-01-26 15:23:16.042 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:23:16 compute-1 nova_compute[183403]: 2026-01-26 15:23:16.074 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:23:16 compute-1 nova_compute[183403]: 2026-01-26 15:23:16.075 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5818MB free_disk=73.14473342895508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:23:16 compute-1 nova_compute[183403]: 2026-01-26 15:23:16.076 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:16 compute-1 nova_compute[183403]: 2026-01-26 15:23:16.076 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:16 compute-1 nova_compute[183403]: 2026-01-26 15:23:16.802 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:16 compute-1 nova_compute[183403]: 2026-01-26 15:23:16.915 183407 INFO nova.scheduler.client.report [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Deleted allocation for migration c6e4073f-8f9c-4399-9bad-86e872650d5e
Jan 26 15:23:16 compute-1 nova_compute[183403]: 2026-01-26 15:23:16.916 183407 DEBUG nova.virt.libvirt.driver [None req-2764b6dc-b95f-4702-b47e-9d59214610da a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 67d3bb43-d956-4beb-8227-316914d585d8] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 15:23:17 compute-1 nova_compute[183403]: 2026-01-26 15:23:17.114 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:23:17 compute-1 nova_compute[183403]: 2026-01-26 15:23:17.115 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:23:16 up  1:18,  0 user,  load average: 0.26, 0.32, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:23:17 compute-1 nova_compute[183403]: 2026-01-26 15:23:17.141 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:23:17 compute-1 nova_compute[183403]: 2026-01-26 15:23:17.329 183407 DEBUG nova.compute.manager [None req-2a33153b-5c44-48d7-ba91-8490a57bcb3f a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Jan 26 15:23:17 compute-1 nova_compute[183403]: 2026-01-26 15:23:17.383 183407 DEBUG nova.compute.provider_tree [None req-2a33153b-5c44-48d7-ba91-8490a57bcb3f a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 21 to 24 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 15:23:17 compute-1 nova_compute[183403]: 2026-01-26 15:23:17.649 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:23:17 compute-1 nova_compute[183403]: 2026-01-26 15:23:17.740 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 24 to 25 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 15:23:18 compute-1 nova_compute[183403]: 2026-01-26 15:23:18.251 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:23:18 compute-1 nova_compute[183403]: 2026-01-26 15:23:18.252 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.175s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:19 compute-1 openstack_network_exporter[195610]: ERROR   15:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:23:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:23:19 compute-1 openstack_network_exporter[195610]: ERROR   15:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:23:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:23:19 compute-1 podman[210223]: 2026-01-26 15:23:19.93730314 +0000 UTC m=+0.084670184 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 26 15:23:19 compute-1 podman[210222]: 2026-01-26 15:23:19.945847252 +0000 UTC m=+0.109027113 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, 
config_id=ovn_controller, io.buildah.version=1.41.4)
Jan 26 15:23:20 compute-1 nova_compute[183403]: 2026-01-26 15:23:20.031 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:20 compute-1 nova_compute[183403]: 2026-01-26 15:23:20.252 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:23:20 compute-1 nova_compute[183403]: 2026-01-26 15:23:20.253 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:23:20 compute-1 nova_compute[183403]: 2026-01-26 15:23:20.253 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:23:20 compute-1 nova_compute[183403]: 2026-01-26 15:23:20.253 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:23:20 compute-1 nova_compute[183403]: 2026-01-26 15:23:20.254 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:23:20 compute-1 nova_compute[183403]: 2026-01-26 15:23:20.254 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:23:21 compute-1 nova_compute[183403]: 2026-01-26 15:23:21.805 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:25 compute-1 nova_compute[183403]: 2026-01-26 15:23:25.034 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:26 compute-1 nova_compute[183403]: 2026-01-26 15:23:26.809 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:29.064 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:23:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:29.067 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:23:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:23:29.067 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:23:30 compute-1 nova_compute[183403]: 2026-01-26 15:23:30.615 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:31 compute-1 nova_compute[183403]: 2026-01-26 15:23:31.848 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:35 compute-1 podman[192725]: time="2026-01-26T15:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:23:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:23:35 compute-1 nova_compute[183403]: 2026-01-26 15:23:35.663 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2189 "" "Go-http-client/1.1"
Jan 26 15:23:36 compute-1 nova_compute[183403]: 2026-01-26 15:23:36.851 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:40 compute-1 nova_compute[183403]: 2026-01-26 15:23:40.686 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:41 compute-1 nova_compute[183403]: 2026-01-26 15:23:41.874 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:42 compute-1 podman[210269]: 2026-01-26 15:23:42.914742317 +0000 UTC m=+0.083643955 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:23:42 compute-1 podman[210270]: 2026-01-26 15:23:42.926554817 +0000 UTC m=+0.093517973 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 15:23:45 compute-1 nova_compute[183403]: 2026-01-26 15:23:45.688 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:46 compute-1 nova_compute[183403]: 2026-01-26 15:23:46.877 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:49 compute-1 openstack_network_exporter[195610]: ERROR   15:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:23:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:23:49 compute-1 openstack_network_exporter[195610]: ERROR   15:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:23:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:23:50 compute-1 nova_compute[183403]: 2026-01-26 15:23:50.692 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:50 compute-1 podman[210313]: 2026-01-26 15:23:50.919301535 +0000 UTC m=+0.083079005 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Jan 26 15:23:50 compute-1 podman[210312]: 2026-01-26 15:23:50.966726337 +0000 UTC m=+0.133672822 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260120, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, container_name=ovn_controller)
Jan 26 15:23:51 compute-1 nova_compute[183403]: 2026-01-26 15:23:51.879 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:55 compute-1 nova_compute[183403]: 2026-01-26 15:23:55.695 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:23:56 compute-1 nova_compute[183403]: 2026-01-26 15:23:56.880 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:00 compute-1 nova_compute[183403]: 2026-01-26 15:24:00.698 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:01 compute-1 nova_compute[183403]: 2026-01-26 15:24:01.883 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:05 compute-1 podman[192725]: time="2026-01-26T15:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:24:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:24:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2191 "" "Go-http-client/1.1"
Jan 26 15:24:05 compute-1 nova_compute[183403]: 2026-01-26 15:24:05.699 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:06 compute-1 nova_compute[183403]: 2026-01-26 15:24:06.885 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:10 compute-1 nova_compute[183403]: 2026-01-26 15:24:10.702 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:11 compute-1 nova_compute[183403]: 2026-01-26 15:24:11.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:24:11 compute-1 nova_compute[183403]: 2026-01-26 15:24:11.887 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:13 compute-1 nova_compute[183403]: 2026-01-26 15:24:13.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:24:13 compute-1 podman[210357]: 2026-01-26 15:24:13.70724928 +0000 UTC m=+0.087409114 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:24:13 compute-1 podman[210358]: 2026-01-26 15:24:13.708028293 +0000 UTC m=+0.082610271 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Jan 26 15:24:15 compute-1 nova_compute[183403]: 2026-01-26 15:24:15.703 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:16 compute-1 nova_compute[183403]: 2026-01-26 15:24:16.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:24:16 compute-1 nova_compute[183403]: 2026-01-26 15:24:16.890 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.090 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.306 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.307 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.344 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.345 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5848MB free_disk=73.14474105834961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.345 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:17 compute-1 nova_compute[183403]: 2026-01-26 15:24:17.345 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:18 compute-1 nova_compute[183403]: 2026-01-26 15:24:18.497 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:24:18 compute-1 nova_compute[183403]: 2026-01-26 15:24:18.498 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:24:17 up  1:19,  0 user,  load average: 0.09, 0.26, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:24:18 compute-1 nova_compute[183403]: 2026-01-26 15:24:18.523 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:24:19 compute-1 nova_compute[183403]: 2026-01-26 15:24:19.033 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:24:19 compute-1 nova_compute[183403]: 2026-01-26 15:24:19.123 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 25 to 26 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 15:24:19 compute-1 openstack_network_exporter[195610]: ERROR   15:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:24:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:24:19 compute-1 openstack_network_exporter[195610]: ERROR   15:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:24:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:24:19 compute-1 nova_compute[183403]: 2026-01-26 15:24:19.651 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:24:19 compute-1 nova_compute[183403]: 2026-01-26 15:24:19.652 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.307s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:20 compute-1 nova_compute[183403]: 2026-01-26 15:24:20.652 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:24:20 compute-1 nova_compute[183403]: 2026-01-26 15:24:20.652 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:24:20 compute-1 nova_compute[183403]: 2026-01-26 15:24:20.653 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:24:20 compute-1 nova_compute[183403]: 2026-01-26 15:24:20.653 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:24:20 compute-1 nova_compute[183403]: 2026-01-26 15:24:20.653 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:24:20 compute-1 nova_compute[183403]: 2026-01-26 15:24:20.653 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:24:20 compute-1 nova_compute[183403]: 2026-01-26 15:24:20.706 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:21 compute-1 nova_compute[183403]: 2026-01-26 15:24:21.892 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:21 compute-1 podman[210404]: 2026-01-26 15:24:21.935618307 +0000 UTC m=+0.095699151 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 26 15:24:21 compute-1 podman[210403]: 2026-01-26 15:24:21.987956766 +0000 UTC m=+0.151904765 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 15:24:24 compute-1 nova_compute[183403]: 2026-01-26 15:24:24.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:24:25 compute-1 nova_compute[183403]: 2026-01-26 15:24:25.708 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:26 compute-1 ovn_controller[95641]: 2026-01-26T15:24:26Z|00152|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 26 15:24:26 compute-1 nova_compute[183403]: 2026-01-26 15:24:26.897 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:29.069 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:29.069 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:29.069 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:30 compute-1 nova_compute[183403]: 2026-01-26 15:24:30.713 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:31 compute-1 nova_compute[183403]: 2026-01-26 15:24:31.900 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:35 compute-1 nova_compute[183403]: 2026-01-26 15:24:35.579 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:35 compute-1 nova_compute[183403]: 2026-01-26 15:24:35.580 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:35 compute-1 podman[192725]: time="2026-01-26T15:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:24:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:24:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2191 "" "Go-http-client/1.1"
Jan 26 15:24:35 compute-1 nova_compute[183403]: 2026-01-26 15:24:35.716 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:36 compute-1 nova_compute[183403]: 2026-01-26 15:24:36.087 183407 DEBUG nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:24:36 compute-1 nova_compute[183403]: 2026-01-26 15:24:36.654 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:36 compute-1 nova_compute[183403]: 2026-01-26 15:24:36.655 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:36 compute-1 nova_compute[183403]: 2026-01-26 15:24:36.667 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:24:36 compute-1 nova_compute[183403]: 2026-01-26 15:24:36.668 183407 INFO nova.compute.claims [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:24:36 compute-1 nova_compute[183403]: 2026-01-26 15:24:36.902 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:37 compute-1 nova_compute[183403]: 2026-01-26 15:24:37.739 183407 DEBUG nova.compute.provider_tree [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:24:38 compute-1 nova_compute[183403]: 2026-01-26 15:24:38.248 183407 DEBUG nova.scheduler.client.report [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:24:38 compute-1 nova_compute[183403]: 2026-01-26 15:24:38.758 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:38 compute-1 nova_compute[183403]: 2026-01-26 15:24:38.760 183407 DEBUG nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:24:39 compute-1 nova_compute[183403]: 2026-01-26 15:24:39.279 183407 DEBUG nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:24:39 compute-1 nova_compute[183403]: 2026-01-26 15:24:39.279 183407 DEBUG nova.network.neutron [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:24:39 compute-1 nova_compute[183403]: 2026-01-26 15:24:39.280 183407 WARNING neutronclient.v2_0.client [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:24:39 compute-1 nova_compute[183403]: 2026-01-26 15:24:39.280 183407 WARNING neutronclient.v2_0.client [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:24:39 compute-1 nova_compute[183403]: 2026-01-26 15:24:39.789 183407 INFO nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:24:40 compute-1 nova_compute[183403]: 2026-01-26 15:24:40.464 183407 DEBUG nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:24:40 compute-1 nova_compute[183403]: 2026-01-26 15:24:40.721 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:40.956 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:24:40 compute-1 nova_compute[183403]: 2026-01-26 15:24:40.956 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:40.957 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.904 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:41 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:41.958 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.987 183407 DEBUG nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.988 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.989 183407 INFO nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Creating image(s)
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.989 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.990 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.991 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.991 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.995 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:24:41 compute-1 nova_compute[183403]: 2026-01-26 15:24:41.997 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.005 183407 DEBUG nova.network.neutron [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Successfully created port: 8a736c48-af6c-4bb5-89e7-34d3cd2654df _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.070 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.072 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.072 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.073 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.079 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.080 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.134 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.136 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.173 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.175 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.175 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.256 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.258 183407 DEBUG nova.virt.disk.api [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Checking if we can resize image /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.259 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.324 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.325 183407 DEBUG nova.virt.disk.api [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Cannot resize image /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.326 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.326 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Ensure instance console log exists: /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.327 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.327 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:42 compute-1 nova_compute[183403]: 2026-01-26 15:24:42.328 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:43 compute-1 podman[210465]: 2026-01-26 15:24:43.888258366 +0000 UTC m=+0.063067799 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, release=1755695350, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 15:24:43 compute-1 podman[210464]: 2026-01-26 15:24:43.901222363 +0000 UTC m=+0.080938571 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:24:44 compute-1 nova_compute[183403]: 2026-01-26 15:24:44.272 183407 DEBUG nova.compute.manager [req-8c24dac0-eaa6-44d2-a4ca-830f7d636abf req-4618209e-656b-4d9a-b783-90552faa2909 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-changed-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:24:44 compute-1 nova_compute[183403]: 2026-01-26 15:24:44.273 183407 DEBUG nova.compute.manager [req-8c24dac0-eaa6-44d2-a4ca-830f7d636abf req-4618209e-656b-4d9a-b783-90552faa2909 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Refreshing instance network info cache due to event network-changed-8a736c48-af6c-4bb5-89e7-34d3cd2654df. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:24:44 compute-1 nova_compute[183403]: 2026-01-26 15:24:44.273 183407 DEBUG oslo_concurrency.lockutils [req-8c24dac0-eaa6-44d2-a4ca-830f7d636abf req-4618209e-656b-4d9a-b783-90552faa2909 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-21c0fdc0-71e1-403c-a34a-fce881dd6046" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:24:44 compute-1 nova_compute[183403]: 2026-01-26 15:24:44.274 183407 DEBUG oslo_concurrency.lockutils [req-8c24dac0-eaa6-44d2-a4ca-830f7d636abf req-4618209e-656b-4d9a-b783-90552faa2909 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-21c0fdc0-71e1-403c-a34a-fce881dd6046" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:24:44 compute-1 nova_compute[183403]: 2026-01-26 15:24:44.274 183407 DEBUG nova.network.neutron [req-8c24dac0-eaa6-44d2-a4ca-830f7d636abf req-4618209e-656b-4d9a-b783-90552faa2909 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Refreshing network info cache for port 8a736c48-af6c-4bb5-89e7-34d3cd2654df _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:24:44 compute-1 nova_compute[183403]: 2026-01-26 15:24:44.278 183407 DEBUG nova.network.neutron [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Successfully updated port: 8a736c48-af6c-4bb5-89e7-34d3cd2654df _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:24:45 compute-1 nova_compute[183403]: 2026-01-26 15:24:45.030 183407 WARNING neutronclient.v2_0.client [req-8c24dac0-eaa6-44d2-a4ca-830f7d636abf req-4618209e-656b-4d9a-b783-90552faa2909 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:24:45 compute-1 nova_compute[183403]: 2026-01-26 15:24:45.040 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "refresh_cache-21c0fdc0-71e1-403c-a34a-fce881dd6046" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:24:45 compute-1 nova_compute[183403]: 2026-01-26 15:24:45.341 183407 DEBUG nova.network.neutron [req-8c24dac0-eaa6-44d2-a4ca-830f7d636abf req-4618209e-656b-4d9a-b783-90552faa2909 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:24:45 compute-1 nova_compute[183403]: 2026-01-26 15:24:45.575 183407 DEBUG nova.network.neutron [req-8c24dac0-eaa6-44d2-a4ca-830f7d636abf req-4618209e-656b-4d9a-b783-90552faa2909 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:24:45 compute-1 nova_compute[183403]: 2026-01-26 15:24:45.722 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:46 compute-1 nova_compute[183403]: 2026-01-26 15:24:46.110 183407 DEBUG oslo_concurrency.lockutils [req-8c24dac0-eaa6-44d2-a4ca-830f7d636abf req-4618209e-656b-4d9a-b783-90552faa2909 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-21c0fdc0-71e1-403c-a34a-fce881dd6046" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:24:46 compute-1 nova_compute[183403]: 2026-01-26 15:24:46.111 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquired lock "refresh_cache-21c0fdc0-71e1-403c-a34a-fce881dd6046" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:24:46 compute-1 nova_compute[183403]: 2026-01-26 15:24:46.112 183407 DEBUG nova.network.neutron [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:24:46 compute-1 nova_compute[183403]: 2026-01-26 15:24:46.906 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:46 compute-1 nova_compute[183403]: 2026-01-26 15:24:46.914 183407 DEBUG nova.network.neutron [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:24:47 compute-1 nova_compute[183403]: 2026-01-26 15:24:47.099 183407 WARNING neutronclient.v2_0.client [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.068 183407 DEBUG nova.network.neutron [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Updating instance_info_cache with network_info: [{"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.579 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Releasing lock "refresh_cache-21c0fdc0-71e1-403c-a34a-fce881dd6046" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.580 183407 DEBUG nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Instance network_info: |[{"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.584 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Start _get_guest_xml network_info=[{"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.589 183407 WARNING nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.591 183407 DEBUG nova.virt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-2057709501', uuid='21c0fdc0-71e1-403c-a34a-fce881dd6046'), owner=OwnerMeta(userid='0c77b3ed882642e3b0c7840dc8efc49a', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin', projectid='e07345fa9028494086d0d062e5c6d037', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769441088.5913427) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.597 183407 DEBUG nova.virt.libvirt.host [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.599 183407 DEBUG nova.virt.libvirt.host [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.609 183407 DEBUG nova.virt.libvirt.host [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.610 183407 DEBUG nova.virt.libvirt.host [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.612 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.612 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.613 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.613 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.614 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.614 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.614 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.615 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.615 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.616 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.616 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.616 183407 DEBUG nova.virt.hardware [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.622 183407 DEBUG nova.virt.libvirt.vif [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-2057709501',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-205',id=19,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e07345fa9028494086d0d062e5c6d037',ramdisk_id='',reservation_id='r-e0xp661h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696',own
er_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:24:40Z,user_data=None,user_id='0c77b3ed882642e3b0c7840dc8efc49a',uuid=21c0fdc0-71e1-403c-a34a-fce881dd6046,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.623 183407 DEBUG nova.network.os_vif_util [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Converting VIF {"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.624 183407 DEBUG nova.network.os_vif_util [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:ea:cd,bridge_name='br-int',has_traffic_filtering=True,id=8a736c48-af6c-4bb5-89e7-34d3cd2654df,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a736c48-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:24:48 compute-1 nova_compute[183403]: 2026-01-26 15:24:48.626 183407 DEBUG nova.objects.instance [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lazy-loading 'pci_devices' on Instance uuid 21c0fdc0-71e1-403c-a34a-fce881dd6046 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.260 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <uuid>21c0fdc0-71e1-403c-a34a-fce881dd6046</uuid>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <name>instance-00000013</name>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-2057709501</nova:name>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:24:48</nova:creationTime>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:24:49 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:24:49 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:user uuid="0c77b3ed882642e3b0c7840dc8efc49a">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin</nova:user>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:project uuid="e07345fa9028494086d0d062e5c6d037">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696</nova:project>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         <nova:port uuid="8a736c48-af6c-4bb5-89e7-34d3cd2654df">
Jan 26 15:24:49 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <system>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <entry name="serial">21c0fdc0-71e1-403c-a34a-fce881dd6046</entry>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <entry name="uuid">21c0fdc0-71e1-403c-a34a-fce881dd6046</entry>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     </system>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <os>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   </os>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <features>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   </features>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.config"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:d7:ea:cd"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <target dev="tap8a736c48-af"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/console.log" append="off"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <video>
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     </video>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:24:49 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:24:49 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:24:49 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:24:49 compute-1 nova_compute[183403]: </domain>
Jan 26 15:24:49 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.263 183407 DEBUG nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Preparing to wait for external event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.263 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.264 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.264 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.266 183407 DEBUG nova.virt.libvirt.vif [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-2057709501',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-205',id=19,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e07345fa9028494086d0d062e5c6d037',ramdisk_id='',reservation_id='r-e0xp661h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-17021
73696',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:24:40Z,user_data=None,user_id='0c77b3ed882642e3b0c7840dc8efc49a',uuid=21c0fdc0-71e1-403c-a34a-fce881dd6046,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.266 183407 DEBUG nova.network.os_vif_util [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Converting VIF {"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.267 183407 DEBUG nova.network.os_vif_util [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:ea:cd,bridge_name='br-int',has_traffic_filtering=True,id=8a736c48-af6c-4bb5-89e7-34d3cd2654df,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a736c48-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.268 183407 DEBUG os_vif [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:ea:cd,bridge_name='br-int',has_traffic_filtering=True,id=8a736c48-af6c-4bb5-89e7-34d3cd2654df,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a736c48-af') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.269 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.269 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.270 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.271 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.271 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'df132852-8c41-5e01-ae63-8b63f0283b48', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.276 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.294 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.294 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a736c48-af, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.295 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8a736c48-af, col_values=(('qos', UUID('17ea71e1-f115-47d5-827c-7ad31eddaabb')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.296 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8a736c48-af, col_values=(('external_ids', {'iface-id': '8a736c48-af6c-4bb5-89e7-34d3cd2654df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:ea:cd', 'vm-uuid': '21c0fdc0-71e1-403c-a34a-fce881dd6046'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.298 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:49 compute-1 NetworkManager[55716]: <info>  [1769441089.2995] manager: (tap8a736c48-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.301 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.308 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:49 compute-1 nova_compute[183403]: 2026-01-26 15:24:49.309 183407 INFO os_vif [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:ea:cd,bridge_name='br-int',has_traffic_filtering=True,id=8a736c48-af6c-4bb5-89e7-34d3cd2654df,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a736c48-af')
Jan 26 15:24:49 compute-1 openstack_network_exporter[195610]: ERROR   15:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:24:49 compute-1 openstack_network_exporter[195610]: ERROR   15:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:24:50 compute-1 nova_compute[183403]: 2026-01-26 15:24:50.724 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:51 compute-1 nova_compute[183403]: 2026-01-26 15:24:51.040 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:24:51 compute-1 nova_compute[183403]: 2026-01-26 15:24:51.040 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:24:51 compute-1 nova_compute[183403]: 2026-01-26 15:24:51.041 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] No VIF found with MAC fa:16:3e:d7:ea:cd, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:24:51 compute-1 nova_compute[183403]: 2026-01-26 15:24:51.041 183407 INFO nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Using config drive
Jan 26 15:24:52 compute-1 podman[210513]: 2026-01-26 15:24:52.93668063 +0000 UTC m=+0.091643081 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Jan 26 15:24:53 compute-1 podman[210512]: 2026-01-26 15:24:53.0092984 +0000 UTC m=+0.167565151 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, 
config_id=ovn_controller, org.label-schema.build-date=20260120)
Jan 26 15:24:53 compute-1 nova_compute[183403]: 2026-01-26 15:24:53.093 183407 WARNING neutronclient.v2_0.client [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:24:53 compute-1 nova_compute[183403]: 2026-01-26 15:24:53.947 183407 INFO nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Creating config drive at /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.config
Jan 26 15:24:53 compute-1 nova_compute[183403]: 2026-01-26 15:24:53.956 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpcb6culwk execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:24:54 compute-1 nova_compute[183403]: 2026-01-26 15:24:54.105 183407 DEBUG oslo_concurrency.processutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpcb6culwk" returned: 0 in 0.149s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:24:54 compute-1 kernel: tap8a736c48-af: entered promiscuous mode
Jan 26 15:24:54 compute-1 NetworkManager[55716]: <info>  [1769441094.2098] manager: (tap8a736c48-af): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Jan 26 15:24:54 compute-1 ovn_controller[95641]: 2026-01-26T15:24:54Z|00153|binding|INFO|Claiming lport 8a736c48-af6c-4bb5-89e7-34d3cd2654df for this chassis.
Jan 26 15:24:54 compute-1 ovn_controller[95641]: 2026-01-26T15:24:54Z|00154|binding|INFO|8a736c48-af6c-4bb5-89e7-34d3cd2654df: Claiming fa:16:3e:d7:ea:cd 10.100.0.5
Jan 26 15:24:54 compute-1 nova_compute[183403]: 2026-01-26 15:24:54.254 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:54 compute-1 ovn_controller[95641]: 2026-01-26T15:24:54Z|00155|binding|INFO|Setting lport 8a736c48-af6c-4bb5-89e7-34d3cd2654df ovn-installed in OVS
Jan 26 15:24:54 compute-1 nova_compute[183403]: 2026-01-26 15:24:54.271 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:54 compute-1 systemd-udevd[210573]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:24:54 compute-1 systemd-machined[154697]: New machine qemu-14-instance-00000013.
Jan 26 15:24:54 compute-1 nova_compute[183403]: 2026-01-26 15:24:54.299 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:54 compute-1 NetworkManager[55716]: <info>  [1769441094.3080] device (tap8a736c48-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:24:54 compute-1 NetworkManager[55716]: <info>  [1769441094.3087] device (tap8a736c48-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:24:54 compute-1 systemd[1]: Started Virtual Machine qemu-14-instance-00000013.
Jan 26 15:24:54 compute-1 ovn_controller[95641]: 2026-01-26T15:24:54Z|00156|binding|INFO|Setting lport 8a736c48-af6c-4bb5-89e7-34d3cd2654df up in Southbound
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.512 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:ea:cd 10.100.0.5'], port_security=['fa:16:3e:d7:ea:cd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '21c0fdc0-71e1-403c-a34a-fce881dd6046', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e07345fa9028494086d0d062e5c6d037', 'neutron:revision_number': '4', 'neutron:security_group_ids': '965d4ca4-c12d-4e4f-bc5a-34d61b5f1699', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d8c1c3b-77a7-4237-87f3-33f734737e5d, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=8a736c48-af6c-4bb5-89e7-34d3cd2654df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.514 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 8a736c48-af6c-4bb5-89e7-34d3cd2654df in datapath 3e9dfa8e-4100-40c2-b5c3-611e27e3b601 bound to our chassis
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.516 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3e9dfa8e-4100-40c2-b5c3-611e27e3b601
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.539 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5e50cd81-090f-41cf-95fb-e9850f9e952c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.540 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3e9dfa8e-41 in ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.542 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3e9dfa8e-40 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.542 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0e255d35-8790-4712-8bb8-f142036db6fb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.543 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a98de137-cb8c-4510-8dce-622a2fd2c271]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.570 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[4ddbe29a-9c6c-4e1b-ba13-d456029f91f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.590 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[be7a7227-ecda-491f-81e8-6e91b98870f5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.631 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[86d28e62-a401-474f-ada3-da24d62ca9ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 NetworkManager[55716]: <info>  [1769441094.6402] manager: (tap3e9dfa8e-40): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Jan 26 15:24:54 compute-1 systemd-udevd[210575]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.641 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c3404994-dbd0-4b57-a08d-3951732ccf6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.692 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e60bcb-14ef-4c4c-9608-610633acb357]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.697 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd82cbf-540f-4a7a-8654-64f615bd6593]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 NetworkManager[55716]: <info>  [1769441094.7305] device (tap3e9dfa8e-40): carrier: link connected
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.744 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[6321d29a-2173-45d4-9f14-358730e6edd3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.770 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[adcfd3fc-05e5-4c35-973a-533ae31529ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e9dfa8e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:8d:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481733, 'reachable_time': 34774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210606, 'error': None, 'target': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.790 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f7bb98-9c28-42fd-9723-f90776f68b5a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:8d23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 481733, 'tstamp': 481733}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210613, 'error': None, 'target': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.812 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[86553b6a-866b-4788-a3ca-1728cee28b9e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e9dfa8e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:8d:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481733, 'reachable_time': 34774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210614, 'error': None, 'target': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.861 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6d36b141-258d-414f-a7d6-9a5ff0a3a447]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.955 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6f30e4-6448-4683-929b-d6d273977a5f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.956 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e9dfa8e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.957 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.958 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e9dfa8e-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:24:54 compute-1 NetworkManager[55716]: <info>  [1769441094.9617] manager: (tap3e9dfa8e-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 26 15:24:54 compute-1 nova_compute[183403]: 2026-01-26 15:24:54.960 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:54 compute-1 kernel: tap3e9dfa8e-40: entered promiscuous mode
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.965 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3e9dfa8e-40, col_values=(('external_ids', {'iface-id': '46bf4c6f-a210-4e32-9067-075b592fdafa'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:24:54 compute-1 ovn_controller[95641]: 2026-01-26T15:24:54Z|00157|binding|INFO|Releasing lport 46bf4c6f-a210-4e32-9067-075b592fdafa from this chassis (sb_readonly=1)
Jan 26 15:24:54 compute-1 nova_compute[183403]: 2026-01-26 15:24:54.966 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:54 compute-1 nova_compute[183403]: 2026-01-26 15:24:54.988 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.990 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[76618660-2943-487c-b4ea-022718dc12e5]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.991 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.991 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.991 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 3e9dfa8e-4100-40c2-b5c3-611e27e3b601 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.992 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.992 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d2bd2227-d80b-4b1b-992f-9aaec28cd994]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.993 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.993 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ff7a8106-38e8-43bb-944f-f2a29cc95570]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.994 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-3e9dfa8e-4100-40c2-b5c3-611e27e3b601
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID 3e9dfa8e-4100-40c2-b5c3-611e27e3b601
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:24:54 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:24:54.995 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'env', 'PROCESS_TAG=haproxy-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:24:55 compute-1 podman[210647]: 2026-01-26 15:24:55.453188544 +0000 UTC m=+0.077851921 container create 21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 15:24:55 compute-1 podman[210647]: 2026-01-26 15:24:55.411486335 +0000 UTC m=+0.036149752 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:24:55 compute-1 systemd[1]: Started libpod-conmon-21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791.scope.
Jan 26 15:24:55 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:24:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5ed0b84901436d27fa66af3db713ebee895466497b91b60aa2f9de0ef61c4b7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:24:55 compute-1 podman[210647]: 2026-01-26 15:24:55.569683379 +0000 UTC m=+0.194346806 container init 21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 15:24:55 compute-1 podman[210647]: 2026-01-26 15:24:55.581649154 +0000 UTC m=+0.206312521 container start 21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120)
Jan 26 15:24:55 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[210662]: [NOTICE]   (210666) : New worker (210668) forked
Jan 26 15:24:55 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[210662]: [NOTICE]   (210666) : Loading success.
Jan 26 15:24:55 compute-1 nova_compute[183403]: 2026-01-26 15:24:55.726 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.121 183407 DEBUG nova.compute.manager [req-1d75bd1a-4aeb-4d34-94a8-be151fd4577e req-6ff87859-a9e7-486f-9e29-94581327dfa5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.122 183407 DEBUG oslo_concurrency.lockutils [req-1d75bd1a-4aeb-4d34-94a8-be151fd4577e req-6ff87859-a9e7-486f-9e29-94581327dfa5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.122 183407 DEBUG oslo_concurrency.lockutils [req-1d75bd1a-4aeb-4d34-94a8-be151fd4577e req-6ff87859-a9e7-486f-9e29-94581327dfa5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.123 183407 DEBUG oslo_concurrency.lockutils [req-1d75bd1a-4aeb-4d34-94a8-be151fd4577e req-6ff87859-a9e7-486f-9e29-94581327dfa5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.123 183407 DEBUG nova.compute.manager [req-1d75bd1a-4aeb-4d34-94a8-be151fd4577e req-6ff87859-a9e7-486f-9e29-94581327dfa5 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Processing event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.124 183407 DEBUG nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.131 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.137 183407 INFO nova.virt.libvirt.driver [-] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Instance spawned successfully.
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.137 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.665 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.666 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.667 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.667 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.668 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:24:56 compute-1 nova_compute[183403]: 2026-01-26 15:24:56.669 183407 DEBUG nova.virt.libvirt.driver [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:24:57 compute-1 nova_compute[183403]: 2026-01-26 15:24:57.263 183407 INFO nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Took 15.27 seconds to spawn the instance on the hypervisor.
Jan 26 15:24:57 compute-1 nova_compute[183403]: 2026-01-26 15:24:57.263 183407 DEBUG nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:24:57 compute-1 nova_compute[183403]: 2026-01-26 15:24:57.926 183407 INFO nova.compute.manager [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Took 21.32 seconds to build instance.
Jan 26 15:24:58 compute-1 nova_compute[183403]: 2026-01-26 15:24:58.227 183407 DEBUG nova.compute.manager [req-5057c4e9-31de-4dc2-b602-38d57d670d7e req-b5541595-7552-4dda-99b9-e033282cc3a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:24:58 compute-1 nova_compute[183403]: 2026-01-26 15:24:58.227 183407 DEBUG oslo_concurrency.lockutils [req-5057c4e9-31de-4dc2-b602-38d57d670d7e req-b5541595-7552-4dda-99b9-e033282cc3a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:24:58 compute-1 nova_compute[183403]: 2026-01-26 15:24:58.229 183407 DEBUG oslo_concurrency.lockutils [req-5057c4e9-31de-4dc2-b602-38d57d670d7e req-b5541595-7552-4dda-99b9-e033282cc3a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:24:58 compute-1 nova_compute[183403]: 2026-01-26 15:24:58.229 183407 DEBUG oslo_concurrency.lockutils [req-5057c4e9-31de-4dc2-b602-38d57d670d7e req-b5541595-7552-4dda-99b9-e033282cc3a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:58 compute-1 nova_compute[183403]: 2026-01-26 15:24:58.230 183407 DEBUG nova.compute.manager [req-5057c4e9-31de-4dc2-b602-38d57d670d7e req-b5541595-7552-4dda-99b9-e033282cc3a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] No waiting events found dispatching network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:24:58 compute-1 nova_compute[183403]: 2026-01-26 15:24:58.230 183407 WARNING nova.compute.manager [req-5057c4e9-31de-4dc2-b602-38d57d670d7e req-b5541595-7552-4dda-99b9-e033282cc3a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received unexpected event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df for instance with vm_state active and task_state None.
Jan 26 15:24:58 compute-1 nova_compute[183403]: 2026-01-26 15:24:58.433 183407 DEBUG oslo_concurrency.lockutils [None req-d09ac4ed-d428-409e-af8e-4d22108f8e59 0c77b3ed882642e3b0c7840dc8efc49a e07345fa9028494086d0d062e5c6d037 - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.853s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:24:59 compute-1 nova_compute[183403]: 2026-01-26 15:24:59.302 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:00 compute-1 nova_compute[183403]: 2026-01-26 15:25:00.728 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:04 compute-1 nova_compute[183403]: 2026-01-26 15:25:04.304 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:05 compute-1 podman[192725]: time="2026-01-26T15:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:25:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 15:25:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2652 "" "Go-http-client/1.1"
Jan 26 15:25:05 compute-1 nova_compute[183403]: 2026-01-26 15:25:05.731 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:07 compute-1 ovn_controller[95641]: 2026-01-26T15:25:07Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d7:ea:cd 10.100.0.5
Jan 26 15:25:07 compute-1 ovn_controller[95641]: 2026-01-26T15:25:07Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d7:ea:cd 10.100.0.5
Jan 26 15:25:09 compute-1 nova_compute[183403]: 2026-01-26 15:25:09.308 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:10 compute-1 nova_compute[183403]: 2026-01-26 15:25:10.733 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:11 compute-1 nova_compute[183403]: 2026-01-26 15:25:11.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:25:13 compute-1 nova_compute[183403]: 2026-01-26 15:25:13.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:25:14 compute-1 nova_compute[183403]: 2026-01-26 15:25:14.311 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:14 compute-1 podman[210690]: 2026-01-26 15:25:14.923912874 +0000 UTC m=+0.082497492 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:25:14 compute-1 podman[210691]: 2026-01-26 15:25:14.93234257 +0000 UTC m=+0.089761392 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Jan 26 15:25:15 compute-1 nova_compute[183403]: 2026-01-26 15:25:15.736 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:17 compute-1 nova_compute[183403]: 2026-01-26 15:25:17.098 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Check if temp file /var/lib/nova/instances/tmpb9a9dirt exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 15:25:17 compute-1 nova_compute[183403]: 2026-01-26 15:25:17.105 183407 DEBUG nova.compute.manager [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpb9a9dirt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='21c0fdc0-71e1-403c-a34a-fce881dd6046',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 15:25:17 compute-1 nova_compute[183403]: 2026-01-26 15:25:17.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:25:18 compute-1 nova_compute[183403]: 2026-01-26 15:25:18.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:18 compute-1 nova_compute[183403]: 2026-01-26 15:25:18.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:18 compute-1 nova_compute[183403]: 2026-01-26 15:25:18.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:18 compute-1 nova_compute[183403]: 2026-01-26 15:25:18.093 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.158 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.233 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.235 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.314 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.328 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:25:19 compute-1 openstack_network_exporter[195610]: ERROR   15:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:25:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:25:19 compute-1 openstack_network_exporter[195610]: ERROR   15:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:25:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.541 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.543 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.583 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.584 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5662MB free_disk=73.11528015136719GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.585 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:19 compute-1 nova_compute[183403]: 2026-01-26 15:25:19.585 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:20 compute-1 nova_compute[183403]: 2026-01-26 15:25:20.605 183407 INFO nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Updating resource usage from migration 44b63b5a-98fd-48ba-a9c6-c9cd639f47bb
Jan 26 15:25:20 compute-1 nova_compute[183403]: 2026-01-26 15:25:20.636 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Migration 44b63b5a-98fd-48ba-a9c6-c9cd639f47bb is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 15:25:20 compute-1 nova_compute[183403]: 2026-01-26 15:25:20.637 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:25:20 compute-1 nova_compute[183403]: 2026-01-26 15:25:20.637 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:25:19 up  1:20,  0 user,  load average: 0.29, 0.27, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_e07345fa9028494086d0d062e5c6d037': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:25:20 compute-1 nova_compute[183403]: 2026-01-26 15:25:20.679 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:25:20 compute-1 nova_compute[183403]: 2026-01-26 15:25:20.738 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.186 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.497 183407 DEBUG oslo_concurrency.processutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.596 183407 DEBUG oslo_concurrency.processutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.597 183407 DEBUG oslo_concurrency.processutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.652 183407 DEBUG oslo_concurrency.processutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.654 183407 DEBUG nova.compute.manager [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Preparing to wait for external event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.654 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.655 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.655 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.697 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:25:21 compute-1 nova_compute[183403]: 2026-01-26 15:25:21.698 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:23 compute-1 nova_compute[183403]: 2026-01-26 15:25:23.698 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:25:23 compute-1 nova_compute[183403]: 2026-01-26 15:25:23.699 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:25:23 compute-1 nova_compute[183403]: 2026-01-26 15:25:23.700 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:25:23 compute-1 nova_compute[183403]: 2026-01-26 15:25:23.700 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:25:23 compute-1 nova_compute[183403]: 2026-01-26 15:25:23.701 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:25:23 compute-1 nova_compute[183403]: 2026-01-26 15:25:23.701 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:25:23 compute-1 podman[210753]: 2026-01-26 15:25:23.937007808 +0000 UTC m=+0.095007272 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 15:25:23 compute-1 podman[210752]: 2026-01-26 15:25:23.988210467 +0000 UTC m=+0.150016097 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 15:25:24 compute-1 nova_compute[183403]: 2026-01-26 15:25:24.318 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:25 compute-1 ovn_controller[95641]: 2026-01-26T15:25:25Z|00158|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 15:25:25 compute-1 nova_compute[183403]: 2026-01-26 15:25:25.741 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:27 compute-1 nova_compute[183403]: 2026-01-26 15:25:27.681 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:27 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:27.681 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:25:27 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:27.683 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:25:27 compute-1 nova_compute[183403]: 2026-01-26 15:25:27.731 183407 DEBUG nova.compute.manager [req-1bea2c10-28a0-485c-96ec-9fdca44005d8 req-50e98dd4-6d7f-4420-84fa-354307ab816f 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:25:27 compute-1 nova_compute[183403]: 2026-01-26 15:25:27.732 183407 DEBUG oslo_concurrency.lockutils [req-1bea2c10-28a0-485c-96ec-9fdca44005d8 req-50e98dd4-6d7f-4420-84fa-354307ab816f 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:27 compute-1 nova_compute[183403]: 2026-01-26 15:25:27.732 183407 DEBUG oslo_concurrency.lockutils [req-1bea2c10-28a0-485c-96ec-9fdca44005d8 req-50e98dd4-6d7f-4420-84fa-354307ab816f 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:27 compute-1 nova_compute[183403]: 2026-01-26 15:25:27.732 183407 DEBUG oslo_concurrency.lockutils [req-1bea2c10-28a0-485c-96ec-9fdca44005d8 req-50e98dd4-6d7f-4420-84fa-354307ab816f 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:27 compute-1 nova_compute[183403]: 2026-01-26 15:25:27.733 183407 DEBUG nova.compute.manager [req-1bea2c10-28a0-485c-96ec-9fdca44005d8 req-50e98dd4-6d7f-4420-84fa-354307ab816f 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] No event matching network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df in dict_keys([('network-vif-plugged', '8a736c48-af6c-4bb5-89e7-34d3cd2654df')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 15:25:27 compute-1 nova_compute[183403]: 2026-01-26 15:25:27.733 183407 DEBUG nova.compute.manager [req-1bea2c10-28a0-485c-96ec-9fdca44005d8 req-50e98dd4-6d7f-4420-84fa-354307ab816f 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:25:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:29.070 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:29.071 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:29.071 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.183 183407 INFO nova.compute.manager [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Took 7.53 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.320 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.794 183407 DEBUG nova.compute.manager [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.794 183407 DEBUG oslo_concurrency.lockutils [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.794 183407 DEBUG oslo_concurrency.lockutils [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.795 183407 DEBUG oslo_concurrency.lockutils [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.795 183407 DEBUG nova.compute.manager [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Processing event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.795 183407 DEBUG nova.compute.manager [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-changed-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.796 183407 DEBUG nova.compute.manager [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Refreshing instance network info cache due to event network-changed-8a736c48-af6c-4bb5-89e7-34d3cd2654df. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.796 183407 DEBUG oslo_concurrency.lockutils [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-21c0fdc0-71e1-403c-a34a-fce881dd6046" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.796 183407 DEBUG oslo_concurrency.lockutils [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-21c0fdc0-71e1-403c-a34a-fce881dd6046" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.796 183407 DEBUG nova.network.neutron [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Refreshing network info cache for port 8a736c48-af6c-4bb5-89e7-34d3cd2654df _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:25:29 compute-1 nova_compute[183403]: 2026-01-26 15:25:29.798 183407 DEBUG nova.compute.manager [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:25:30 compute-1 nova_compute[183403]: 2026-01-26 15:25:30.304 183407 WARNING neutronclient.v2_0.client [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:25:30 compute-1 nova_compute[183403]: 2026-01-26 15:25:30.311 183407 DEBUG nova.compute.manager [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpb9a9dirt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='21c0fdc0-71e1-403c-a34a-fce881dd6046',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(44b63b5a-98fd-48ba-a9c6-c9cd639f47bb),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 15:25:30 compute-1 nova_compute[183403]: 2026-01-26 15:25:30.743 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:30 compute-1 nova_compute[183403]: 2026-01-26 15:25:30.825 183407 DEBUG nova.objects.instance [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 21c0fdc0-71e1-403c-a34a-fce881dd6046 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:25:30 compute-1 nova_compute[183403]: 2026-01-26 15:25:30.827 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 15:25:30 compute-1 nova_compute[183403]: 2026-01-26 15:25:30.830 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 15:25:30 compute-1 nova_compute[183403]: 2026-01-26 15:25:30.830 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.216 183407 WARNING neutronclient.v2_0.client [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.335 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.335 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.340 183407 DEBUG nova.virt.libvirt.vif [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-2057709501',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-205',id=19,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:24:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e07345fa9028494086d0d062e5c6d037',ramdisk_id='',reservation_id='r-e0xp661h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:24:57Z,user_data=None,user_id='0c77b3ed882642e3b0c7840dc8efc49a',uuid=21c0fdc0-71e1-403c-a34a-fce881dd6046,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.341 183407 DEBUG nova.network.os_vif_util [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.341 183407 DEBUG nova.network.os_vif_util [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:ea:cd,bridge_name='br-int',has_traffic_filtering=True,id=8a736c48-af6c-4bb5-89e7-34d3cd2654df,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a736c48-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.342 183407 DEBUG nova.virt.libvirt.migration [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <mac address="fa:16:3e:d7:ea:cd"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <model type="virtio"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <mtu size="1442"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <target dev="tap8a736c48-af"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]: </interface>
Jan 26 15:25:31 compute-1 nova_compute[183403]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.343 183407 DEBUG nova.virt.libvirt.migration [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <name>instance-00000013</name>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <uuid>21c0fdc0-71e1-403c-a34a-fce881dd6046</uuid>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-2057709501</nova:name>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:24:48</nova:creationTime>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:25:31 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:25:31 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:user uuid="0c77b3ed882642e3b0c7840dc8efc49a">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin</nova:user>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:project uuid="e07345fa9028494086d0d062e5c6d037">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696</nova:project>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:port uuid="8a736c48-af6c-4bb5-89e7-34d3cd2654df">
Jan 26 15:25:31 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <system>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="serial">21c0fdc0-71e1-403c-a34a-fce881dd6046</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="uuid">21c0fdc0-71e1-403c-a34a-fce881dd6046</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </system>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <os>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </os>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <features>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </features>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.config"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <interface type="ethernet"><mac address="fa:16:3e:d7:ea:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8a736c48-af"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </interface><serial type="pty">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/console.log" append="off"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </target>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/console.log" append="off"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </console>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </input>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <video>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </video>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]: </domain>
Jan 26 15:25:31 compute-1 nova_compute[183403]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.344 183407 DEBUG nova.virt.libvirt.migration [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <name>instance-00000013</name>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <uuid>21c0fdc0-71e1-403c-a34a-fce881dd6046</uuid>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-2057709501</nova:name>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:24:48</nova:creationTime>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:25:31 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:25:31 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:user uuid="0c77b3ed882642e3b0c7840dc8efc49a">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin</nova:user>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:project uuid="e07345fa9028494086d0d062e5c6d037">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696</nova:project>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:port uuid="8a736c48-af6c-4bb5-89e7-34d3cd2654df">
Jan 26 15:25:31 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <system>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="serial">21c0fdc0-71e1-403c-a34a-fce881dd6046</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="uuid">21c0fdc0-71e1-403c-a34a-fce881dd6046</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </system>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <os>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </os>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <features>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </features>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.config"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:d7:ea:cd"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target dev="tap8a736c48-af"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/console.log" append="off"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </target>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/console.log" append="off"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </console>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </input>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <video>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </video>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]: </domain>
Jan 26 15:25:31 compute-1 nova_compute[183403]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.344 183407 DEBUG nova.virt.libvirt.migration [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <name>instance-00000013</name>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <uuid>21c0fdc0-71e1-403c-a34a-fce881dd6046</uuid>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-2057709501</nova:name>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:24:48</nova:creationTime>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:25:31 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:25:31 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:user uuid="0c77b3ed882642e3b0c7840dc8efc49a">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin</nova:user>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:project uuid="e07345fa9028494086d0d062e5c6d037">tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696</nova:project>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <nova:port uuid="8a736c48-af6c-4bb5-89e7-34d3cd2654df">
Jan 26 15:25:31 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <system>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="serial">21c0fdc0-71e1-403c-a34a-fce881dd6046</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="uuid">21c0fdc0-71e1-403c-a34a-fce881dd6046</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </system>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <os>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </os>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <features>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </features>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/disk.config"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <interface type="ethernet"><mac address="fa:16:3e:d7:ea:cd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8a736c48-af"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </interface><serial type="pty">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/console.log" append="off"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:25:31 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       </target>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046/console.log" append="off"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </console>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </input>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <video>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </video>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:25:31 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:25:31 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:25:31 compute-1 nova_compute[183403]: </domain>
Jan 26 15:25:31 compute-1 nova_compute[183403]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.345 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.372 183407 DEBUG nova.network.neutron [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Updated VIF entry in instance network info cache for port 8a736c48-af6c-4bb5-89e7-34d3cd2654df. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.373 183407 DEBUG nova.network.neutron [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Updating instance_info_cache with network_info: [{"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.839 183407 DEBUG nova.virt.libvirt.migration [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.839 183407 INFO nova.virt.libvirt.migration [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 15:25:31 compute-1 nova_compute[183403]: 2026-01-26 15:25:31.882 183407 DEBUG oslo_concurrency.lockutils [req-7612b1ed-6865-4bc3-b42c-b566268a6c2e req-54ba6f74-6662-4390-b2b0-b92b8c1db4d0 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-21c0fdc0-71e1-403c-a34a-fce881dd6046" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:25:32 compute-1 nova_compute[183403]: 2026-01-26 15:25:32.863 183407 INFO nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 15:25:33 compute-1 nova_compute[183403]: 2026-01-26 15:25:33.368 183407 DEBUG nova.virt.libvirt.migration [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 15:25:33 compute-1 nova_compute[183403]: 2026-01-26 15:25:33.369 183407 DEBUG nova.virt.libvirt.migration [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 15:25:33 compute-1 kernel: tap8a736c48-af (unregistering): left promiscuous mode
Jan 26 15:25:33 compute-1 NetworkManager[55716]: <info>  [1769441133.8234] device (tap8a736c48-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:25:33 compute-1 ovn_controller[95641]: 2026-01-26T15:25:33Z|00159|binding|INFO|Releasing lport 8a736c48-af6c-4bb5-89e7-34d3cd2654df from this chassis (sb_readonly=0)
Jan 26 15:25:33 compute-1 ovn_controller[95641]: 2026-01-26T15:25:33Z|00160|binding|INFO|Setting lport 8a736c48-af6c-4bb5-89e7-34d3cd2654df down in Southbound
Jan 26 15:25:33 compute-1 ovn_controller[95641]: 2026-01-26T15:25:33Z|00161|binding|INFO|Removing iface tap8a736c48-af ovn-installed in OVS
Jan 26 15:25:33 compute-1 nova_compute[183403]: 2026-01-26 15:25:33.833 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:33 compute-1 nova_compute[183403]: 2026-01-26 15:25:33.836 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:33.841 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:ea:cd 10.100.0.5'], port_security=['fa:16:3e:d7:ea:cd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3e0272b2-d627-4653-a221-12286e3af322'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '21c0fdc0-71e1-403c-a34a-fce881dd6046', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e07345fa9028494086d0d062e5c6d037', 'neutron:revision_number': '10', 'neutron:security_group_ids': '965d4ca4-c12d-4e4f-bc5a-34d61b5f1699', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d8c1c3b-77a7-4237-87f3-33f734737e5d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=8a736c48-af6c-4bb5-89e7-34d3cd2654df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:25:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:33.842 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 8a736c48-af6c-4bb5-89e7-34d3cd2654df in datapath 3e9dfa8e-4100-40c2-b5c3-611e27e3b601 unbound from our chassis
Jan 26 15:25:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:33.843 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e9dfa8e-4100-40c2-b5c3-611e27e3b601, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:25:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:33.845 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d4786644-9ae8-4025-9909-3c7491ff5f08]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:25:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:33.845 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 namespace which is not needed anymore
Jan 26 15:25:33 compute-1 nova_compute[183403]: 2026-01-26 15:25:33.853 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:33 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 26 15:25:33 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Consumed 14.207s CPU time.
Jan 26 15:25:33 compute-1 systemd-machined[154697]: Machine qemu-14-instance-00000013 terminated.
Jan 26 15:25:33 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[210662]: [NOTICE]   (210666) : haproxy version is 3.0.5-8e879a5
Jan 26 15:25:33 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[210662]: [NOTICE]   (210666) : path to executable is /usr/sbin/haproxy
Jan 26 15:25:33 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[210662]: [WARNING]  (210666) : Exiting Master process...
Jan 26 15:25:33 compute-1 podman[210829]: 2026-01-26 15:25:33.998957619 +0000 UTC m=+0.050074066 container kill 21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120)
Jan 26 15:25:34 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[210662]: [ALERT]    (210666) : Current worker (210668) exited with code 143 (Terminated)
Jan 26 15:25:34 compute-1 neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601[210662]: [WARNING]  (210666) : All workers exited. Exiting... (0)
Jan 26 15:25:34 compute-1 systemd[1]: libpod-21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791.scope: Deactivated successfully.
Jan 26 15:25:34 compute-1 podman[210844]: 2026-01-26 15:25:34.062764951 +0000 UTC m=+0.034602734 container died 21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.092 183407 DEBUG nova.virt.libvirt.guest [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.092 183407 INFO nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Migration operation has completed
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.093 183407 INFO nova.compute.manager [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] _post_live_migration() is started..
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.097 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.098 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.098 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 15:25:34 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791-userdata-shm.mount: Deactivated successfully.
Jan 26 15:25:34 compute-1 systemd[1]: var-lib-containers-storage-overlay-b5ed0b84901436d27fa66af3db713ebee895466497b91b60aa2f9de0ef61c4b7-merged.mount: Deactivated successfully.
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.113 183407 WARNING neutronclient.v2_0.client [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:25:34 compute-1 podman[210844]: 2026-01-26 15:25:34.113597588 +0000 UTC m=+0.085435341 container cleanup 21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.113 183407 WARNING neutronclient.v2_0.client [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:25:34 compute-1 systemd[1]: libpod-conmon-21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791.scope: Deactivated successfully.
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.122 183407 DEBUG nova.compute.manager [req-3a1e4f62-b5d6-4de2-a09c-5ff30d648da5 req-7553e56d-683c-4478-a340-c6c55b687f17 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.122 183407 DEBUG oslo_concurrency.lockutils [req-3a1e4f62-b5d6-4de2-a09c-5ff30d648da5 req-7553e56d-683c-4478-a340-c6c55b687f17 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.123 183407 DEBUG oslo_concurrency.lockutils [req-3a1e4f62-b5d6-4de2-a09c-5ff30d648da5 req-7553e56d-683c-4478-a340-c6c55b687f17 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.123 183407 DEBUG oslo_concurrency.lockutils [req-3a1e4f62-b5d6-4de2-a09c-5ff30d648da5 req-7553e56d-683c-4478-a340-c6c55b687f17 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.123 183407 DEBUG nova.compute.manager [req-3a1e4f62-b5d6-4de2-a09c-5ff30d648da5 req-7553e56d-683c-4478-a340-c6c55b687f17 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] No waiting events found dispatching network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.123 183407 DEBUG nova.compute.manager [req-3a1e4f62-b5d6-4de2-a09c-5ff30d648da5 req-7553e56d-683c-4478-a340-c6c55b687f17 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:25:34 compute-1 podman[210847]: 2026-01-26 15:25:34.137556817 +0000 UTC m=+0.104978885 container remove 21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120)
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.153 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2b56a73c-f5ba-4962-885c-c5ab050d67ee]: (4, ("Mon Jan 26 03:25:33 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 (21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791)\n21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791\nMon Jan 26 03:25:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 (21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791)\n21241bbcd866399780d30fef6fa5e96d88b0373652cda1657c79bf8919814791\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.155 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6d53a43f-b672-40b5-a3bf-a761894f29c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.156 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e9dfa8e-4100-40c2-b5c3-611e27e3b601.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.157 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[590f53f3-fd4e-432f-a4de-6e9e0003c1e1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.158 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e9dfa8e-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.161 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:34 compute-1 kernel: tap3e9dfa8e-40: left promiscuous mode
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.183 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.187 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e1073e69-a32d-44f5-9e45-0cece4777772]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.203 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e8971c-9bbd-4ecc-b8fc-336ca95af800]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.204 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[551d1643-2c70-461e-8b3e-ab2f0f86e31e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.230 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5d085d96-534e-4cbd-a37a-ef84a118c52e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481722, 'reachable_time': 34699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210895, 'error': None, 'target': 'ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.237 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3e9dfa8e-4100-40c2-b5c3-611e27e3b601 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.237 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[88cf1e36-fe07-4619-86c4-962c3f14a49f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:25:34 compute-1 systemd[1]: run-netns-ovnmeta\x2d3e9dfa8e\x2d4100\x2d40c2\x2db5c3\x2d611e27e3b601.mount: Deactivated successfully.
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.322 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.545 183407 DEBUG nova.network.neutron [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Activated binding for port 8a736c48-af6c-4bb5-89e7-34d3cd2654df and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.546 183407 DEBUG nova.compute.manager [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.547 183407 DEBUG nova.virt.libvirt.vif [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-2057709501',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-205',id=19,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:24:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e07345fa9028494086d0d062e5c6d037',ramdisk_id='',reservation_id='r-e0xp661h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1702173696-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:25:10Z,user_data=None,user_id='0c77b3ed882642e3b0c7840dc8efc49a',uuid=21c0fdc0-71e1-403c-a34a-fce881dd6046,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.548 183407 DEBUG nova.network.os_vif_util [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "address": "fa:16:3e:d7:ea:cd", "network": {"id": "3e9dfa8e-4100-40c2-b5c3-611e27e3b601", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-854362176-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9147aacc62b34487b1727939f1a92703", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a736c48-af", "ovs_interfaceid": "8a736c48-af6c-4bb5-89e7-34d3cd2654df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.549 183407 DEBUG nova.network.os_vif_util [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:ea:cd,bridge_name='br-int',has_traffic_filtering=True,id=8a736c48-af6c-4bb5-89e7-34d3cd2654df,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a736c48-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.549 183407 DEBUG os_vif [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:ea:cd,bridge_name='br-int',has_traffic_filtering=True,id=8a736c48-af6c-4bb5-89e7-34d3cd2654df,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a736c48-af') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.553 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.553 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a736c48-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.557 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.559 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.559 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=17ea71e1-f115-47d5-827c-7ad31eddaabb) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.564 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.567 183407 INFO os_vif [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:ea:cd,bridge_name='br-int',has_traffic_filtering=True,id=8a736c48-af6c-4bb5-89e7-34d3cd2654df,network=Network(3e9dfa8e-4100-40c2-b5c3-611e27e3b601),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a736c48-af')
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.568 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.568 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.568 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.569 183407 DEBUG nova.compute.manager [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.569 183407 INFO nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Deleting instance files /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046_del
Jan 26 15:25:34 compute-1 nova_compute[183403]: 2026-01-26 15:25:34.570 183407 INFO nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Deletion of /var/lib/nova/instances/21c0fdc0-71e1-403c-a34a-fce881dd6046_del complete
Jan 26 15:25:34 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:25:34.684 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:25:35 compute-1 podman[192725]: time="2026-01-26T15:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:25:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:25:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2192 "" "Go-http-client/1.1"
Jan 26 15:25:35 compute-1 nova_compute[183403]: 2026-01-26 15:25:35.746 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.313 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.313 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.314 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.314 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.314 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] No waiting events found dispatching network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.314 183407 WARNING nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received unexpected event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df for instance with vm_state active and task_state migrating.
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.315 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.315 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.315 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.316 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.316 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] No waiting events found dispatching network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.316 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.316 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.317 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.317 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.317 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.317 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] No waiting events found dispatching network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.318 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-unplugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.318 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.318 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.318 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.319 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.319 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] No waiting events found dispatching network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.319 183407 WARNING nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received unexpected event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df for instance with vm_state active and task_state migrating.
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.319 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.320 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.320 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.320 183407 DEBUG oslo_concurrency.lockutils [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.320 183407 DEBUG nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] No waiting events found dispatching network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:25:36 compute-1 nova_compute[183403]: 2026-01-26 15:25:36.321 183407 WARNING nova.compute.manager [req-eee773ed-ccc2-4804-bbf5-e205de5306a3 req-da365e05-7903-4072-a29e-cd8ddd8441a7 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Received unexpected event network-vif-plugged-8a736c48-af6c-4bb5-89e7-34d3cd2654df for instance with vm_state active and task_state migrating.
Jan 26 15:25:39 compute-1 nova_compute[183403]: 2026-01-26 15:25:39.599 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:40 compute-1 nova_compute[183403]: 2026-01-26 15:25:40.748 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:44 compute-1 nova_compute[183403]: 2026-01-26 15:25:44.601 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:44 compute-1 nova_compute[183403]: 2026-01-26 15:25:44.611 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:44 compute-1 nova_compute[183403]: 2026-01-26 15:25:44.611 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:44 compute-1 nova_compute[183403]: 2026-01-26 15:25:44.612 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "21c0fdc0-71e1-403c-a34a-fce881dd6046-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.126 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.126 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.127 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.127 183407 DEBUG nova.compute.resource_tracker [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:25:45 compute-1 podman[210897]: 2026-01-26 15:25:45.256864678 +0000 UTC m=+0.075383545 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:25:45 compute-1 podman[210898]: 2026-01-26 15:25:45.299949469 +0000 UTC m=+0.124007124 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.347 183407 WARNING nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.348 183407 DEBUG oslo_concurrency.processutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.385 183407 DEBUG oslo_concurrency.processutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.386 183407 DEBUG nova.compute.resource_tracker [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5827MB free_disk=73.14460754394531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": 
"0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.386 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.387 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:25:45 compute-1 nova_compute[183403]: 2026-01-26 15:25:45.750 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:46 compute-1 nova_compute[183403]: 2026-01-26 15:25:46.410 183407 DEBUG nova.compute.resource_tracker [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Migration for instance 21c0fdc0-71e1-403c-a34a-fce881dd6046 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 15:25:47 compute-1 nova_compute[183403]: 2026-01-26 15:25:47.152 183407 DEBUG nova.compute.resource_tracker [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 15:25:47 compute-1 nova_compute[183403]: 2026-01-26 15:25:47.186 183407 DEBUG nova.compute.resource_tracker [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Migration 44b63b5a-98fd-48ba-a9c6-c9cd639f47bb is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 15:25:47 compute-1 nova_compute[183403]: 2026-01-26 15:25:47.187 183407 DEBUG nova.compute.resource_tracker [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:25:47 compute-1 nova_compute[183403]: 2026-01-26 15:25:47.187 183407 DEBUG nova.compute.resource_tracker [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:25:45 up  1:21,  0 user,  load average: 0.32, 0.28, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:25:47 compute-1 nova_compute[183403]: 2026-01-26 15:25:47.222 183407 DEBUG nova.compute.provider_tree [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:25:47 compute-1 nova_compute[183403]: 2026-01-26 15:25:47.730 183407 DEBUG nova.scheduler.client.report [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:25:48 compute-1 nova_compute[183403]: 2026-01-26 15:25:48.247 183407 DEBUG nova.compute.resource_tracker [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:25:48 compute-1 nova_compute[183403]: 2026-01-26 15:25:48.248 183407 DEBUG oslo_concurrency.lockutils [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.861s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:25:48 compute-1 nova_compute[183403]: 2026-01-26 15:25:48.270 183407 INFO nova.compute.manager [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Jan 26 15:25:49 compute-1 nova_compute[183403]: 2026-01-26 15:25:49.347 183407 INFO nova.scheduler.client.report [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Deleted allocation for migration 44b63b5a-98fd-48ba-a9c6-c9cd639f47bb
Jan 26 15:25:49 compute-1 nova_compute[183403]: 2026-01-26 15:25:49.348 183407 DEBUG nova.virt.libvirt.driver [None req-acb5f05e-11b5-4cae-963d-d53df6c74714 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 21c0fdc0-71e1-403c-a34a-fce881dd6046] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 15:25:49 compute-1 openstack_network_exporter[195610]: ERROR   15:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:25:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:25:49 compute-1 openstack_network_exporter[195610]: ERROR   15:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:25:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:25:49 compute-1 nova_compute[183403]: 2026-01-26 15:25:49.643 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:50 compute-1 nova_compute[183403]: 2026-01-26 15:25:50.753 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:54 compute-1 nova_compute[183403]: 2026-01-26 15:25:54.645 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:54 compute-1 podman[210945]: 2026-01-26 15:25:54.90343091 +0000 UTC m=+0.068077423 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 15:25:54 compute-1 podman[210944]: 2026-01-26 15:25:54.973216914 +0000 UTC m=+0.136155704 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:25:55 compute-1 nova_compute[183403]: 2026-01-26 15:25:55.755 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:25:59 compute-1 nova_compute[183403]: 2026-01-26 15:25:59.674 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:00 compute-1 nova_compute[183403]: 2026-01-26 15:26:00.757 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:04 compute-1 nova_compute[183403]: 2026-01-26 15:26:04.724 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:05 compute-1 podman[192725]: time="2026-01-26T15:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:26:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:26:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2191 "" "Go-http-client/1.1"
Jan 26 15:26:05 compute-1 nova_compute[183403]: 2026-01-26 15:26:05.796 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:09 compute-1 nova_compute[183403]: 2026-01-26 15:26:09.726 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:10 compute-1 nova_compute[183403]: 2026-01-26 15:26:10.300 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:10 compute-1 nova_compute[183403]: 2026-01-26 15:26:10.841 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:13 compute-1 nova_compute[183403]: 2026-01-26 15:26:13.578 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:14 compute-1 nova_compute[183403]: 2026-01-26 15:26:14.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:14 compute-1 nova_compute[183403]: 2026-01-26 15:26:14.729 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:15 compute-1 sshd-session[210989]: Invalid user admin from 45.148.10.121 port 38886
Jan 26 15:26:15 compute-1 sshd-session[210989]: Connection closed by invalid user admin 45.148.10.121 port 38886 [preauth]
Jan 26 15:26:15 compute-1 podman[210991]: 2026-01-26 15:26:15.693047937 +0000 UTC m=+0.083674785 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:26:15 compute-1 podman[210992]: 2026-01-26 15:26:15.71644384 +0000 UTC m=+0.088215077 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:26:15 compute-1 nova_compute[183403]: 2026-01-26 15:26:15.844 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:17 compute-1 nova_compute[183403]: 2026-01-26 15:26:17.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.091 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.275 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.277 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.300 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.300 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5834MB free_disk=73.14460754394531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.301 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:26:18 compute-1 nova_compute[183403]: 2026-01-26 15:26:18.301 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:26:19 compute-1 nova_compute[183403]: 2026-01-26 15:26:19.361 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:26:19 compute-1 nova_compute[183403]: 2026-01-26 15:26:19.361 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:26:18 up  1:21,  0 user,  load average: 0.20, 0.25, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:26:19 compute-1 nova_compute[183403]: 2026-01-26 15:26:19.388 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:26:19 compute-1 openstack_network_exporter[195610]: ERROR   15:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:26:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:26:19 compute-1 openstack_network_exporter[195610]: ERROR   15:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:26:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:26:19 compute-1 nova_compute[183403]: 2026-01-26 15:26:19.732 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:19 compute-1 nova_compute[183403]: 2026-01-26 15:26:19.897 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:26:20 compute-1 nova_compute[183403]: 2026-01-26 15:26:20.410 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:26:20 compute-1 nova_compute[183403]: 2026-01-26 15:26:20.410 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:26:20 compute-1 nova_compute[183403]: 2026-01-26 15:26:20.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:20 compute-1 nova_compute[183403]: 2026-01-26 15:26:20.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:26:20 compute-1 nova_compute[183403]: 2026-01-26 15:26:20.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:20 compute-1 nova_compute[183403]: 2026-01-26 15:26:20.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:26:20 compute-1 nova_compute[183403]: 2026-01-26 15:26:20.845 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:21 compute-1 nova_compute[183403]: 2026-01-26 15:26:21.085 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:26:22 compute-1 nova_compute[183403]: 2026-01-26 15:26:22.080 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:22 compute-1 nova_compute[183403]: 2026-01-26 15:26:22.081 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:22 compute-1 nova_compute[183403]: 2026-01-26 15:26:22.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:23 compute-1 nova_compute[183403]: 2026-01-26 15:26:23.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:24 compute-1 nova_compute[183403]: 2026-01-26 15:26:24.734 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:25 compute-1 nova_compute[183403]: 2026-01-26 15:26:25.847 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:25 compute-1 podman[211038]: 2026-01-26 15:26:25.903366722 +0000 UTC m=+0.075886935 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:26:25 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:25.916 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:f3:ea 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-567e8645-0094-48f0-9603-67223f9e4c7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4997f838a71499eb0b82dabfe381bfe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c086ead-9989-49f1-93e0-00527766eebe, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3d7320e6-f866-41a6-98db-038f98178a52) old=Port_Binding(mac=['fa:16:3e:25:f3:ea'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-567e8645-0094-48f0-9603-67223f9e4c7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4997f838a71499eb0b82dabfe381bfe', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:26:25 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:25.918 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3d7320e6-f866-41a6-98db-038f98178a52 in datapath 567e8645-0094-48f0-9603-67223f9e4c7a updated
Jan 26 15:26:25 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:25.918 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 567e8645-0094-48f0-9603-67223f9e4c7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:26:25 compute-1 podman[211037]: 2026-01-26 15:26:25.919243971 +0000 UTC m=+0.102967207 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, 
tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 15:26:25 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:25.921 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[53cc46f7-f60b-4809-af26-1fa0a0a49674]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:26:26 compute-1 nova_compute[183403]: 2026-01-26 15:26:26.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:26 compute-1 nova_compute[183403]: 2026-01-26 15:26:26.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:26:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:29.072 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:26:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:29.073 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:26:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:29.073 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:26:29 compute-1 nova_compute[183403]: 2026-01-26 15:26:29.735 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:30 compute-1 nova_compute[183403]: 2026-01-26 15:26:30.082 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:30.252 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:26:30 compute-1 nova_compute[183403]: 2026-01-26 15:26:30.253 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:30.253 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:26:30 compute-1 nova_compute[183403]: 2026-01-26 15:26:30.888 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:34 compute-1 nova_compute[183403]: 2026-01-26 15:26:34.770 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:35 compute-1 podman[192725]: time="2026-01-26T15:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:26:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:26:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2189 "" "Go-http-client/1.1"
Jan 26 15:26:35 compute-1 nova_compute[183403]: 2026-01-26 15:26:35.918 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:36.598 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:9e:28 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-af0f971c-6645-455c-a3ca-07b0a110d95d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af0f971c-6645-455c-a3ca-07b0a110d95d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab5cf25b2abc42399ccb7131f5e1e913', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58f245d4-9e48-4647-8476-5786b75aa109, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d38bc340-7ca8-49a4-8d8d-504f0e8c1d45) old=Port_Binding(mac=['fa:16:3e:32:9e:28'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-af0f971c-6645-455c-a3ca-07b0a110d95d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af0f971c-6645-455c-a3ca-07b0a110d95d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab5cf25b2abc42399ccb7131f5e1e913', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:26:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:36.600 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d38bc340-7ca8-49a4-8d8d-504f0e8c1d45 in datapath af0f971c-6645-455c-a3ca-07b0a110d95d updated
Jan 26 15:26:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:36.601 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network af0f971c-6645-455c-a3ca-07b0a110d95d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:26:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:36.602 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[22d60f92-6ba0-4c49-9a54-55f99052c003]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:26:38 compute-1 nova_compute[183403]: 2026-01-26 15:26:38.186 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:39 compute-1 nova_compute[183403]: 2026-01-26 15:26:39.772 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:26:40.255 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:26:40 compute-1 nova_compute[183403]: 2026-01-26 15:26:40.920 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:42 compute-1 nova_compute[183403]: 2026-01-26 15:26:42.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:44 compute-1 nova_compute[183403]: 2026-01-26 15:26:44.775 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:45 compute-1 podman[211085]: 2026-01-26 15:26:45.908377089 +0000 UTC m=+0.078172574 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:26:45 compute-1 nova_compute[183403]: 2026-01-26 15:26:45.923 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:45 compute-1 podman[211086]: 2026-01-26 15:26:45.940511149 +0000 UTC m=+0.104603540 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:26:46 compute-1 ovn_controller[95641]: 2026-01-26T15:26:46Z|00162|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 15:26:49 compute-1 openstack_network_exporter[195610]: ERROR   15:26:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:26:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:26:49 compute-1 openstack_network_exporter[195610]: ERROR   15:26:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:26:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:26:49 compute-1 nova_compute[183403]: 2026-01-26 15:26:49.779 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:50 compute-1 nova_compute[183403]: 2026-01-26 15:26:50.926 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:53 compute-1 nova_compute[183403]: 2026-01-26 15:26:53.023 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:26:54 compute-1 nova_compute[183403]: 2026-01-26 15:26:54.785 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:55 compute-1 nova_compute[183403]: 2026-01-26 15:26:55.928 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:26:56 compute-1 podman[211130]: 2026-01-26 15:26:56.962909001 +0000 UTC m=+0.121909298 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:26:56 compute-1 podman[211131]: 2026-01-26 15:26:56.963801105 +0000 UTC m=+0.113962404 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 15:26:59 compute-1 nova_compute[183403]: 2026-01-26 15:26:59.788 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:00 compute-1 nova_compute[183403]: 2026-01-26 15:27:00.930 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:04 compute-1 nova_compute[183403]: 2026-01-26 15:27:04.790 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:05 compute-1 podman[192725]: time="2026-01-26T15:27:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:27:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:27:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:27:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:27:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2191 "" "Go-http-client/1.1"
Jan 26 15:27:05 compute-1 nova_compute[183403]: 2026-01-26 15:27:05.931 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:06 compute-1 nova_compute[183403]: 2026-01-26 15:27:06.697 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:06 compute-1 nova_compute[183403]: 2026-01-26 15:27:06.698 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:07 compute-1 nova_compute[183403]: 2026-01-26 15:27:07.206 183407 DEBUG nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:27:07 compute-1 nova_compute[183403]: 2026-01-26 15:27:07.759 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:07 compute-1 nova_compute[183403]: 2026-01-26 15:27:07.759 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:07 compute-1 nova_compute[183403]: 2026-01-26 15:27:07.769 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:27:07 compute-1 nova_compute[183403]: 2026-01-26 15:27:07.769 183407 INFO nova.compute.claims [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:27:08 compute-1 nova_compute[183403]: 2026-01-26 15:27:08.826 183407 DEBUG nova.compute.provider_tree [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:27:09 compute-1 nova_compute[183403]: 2026-01-26 15:27:09.336 183407 DEBUG nova.scheduler.client.report [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:27:09 compute-1 nova_compute[183403]: 2026-01-26 15:27:09.792 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:09 compute-1 nova_compute[183403]: 2026-01-26 15:27:09.848 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:09 compute-1 nova_compute[183403]: 2026-01-26 15:27:09.849 183407 DEBUG nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:27:10 compute-1 nova_compute[183403]: 2026-01-26 15:27:10.362 183407 DEBUG nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:27:10 compute-1 nova_compute[183403]: 2026-01-26 15:27:10.363 183407 DEBUG nova.network.neutron [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:27:10 compute-1 nova_compute[183403]: 2026-01-26 15:27:10.363 183407 WARNING neutronclient.v2_0.client [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:27:10 compute-1 nova_compute[183403]: 2026-01-26 15:27:10.364 183407 WARNING neutronclient.v2_0.client [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:27:10 compute-1 nova_compute[183403]: 2026-01-26 15:27:10.873 183407 INFO nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:27:10 compute-1 nova_compute[183403]: 2026-01-26 15:27:10.933 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:11 compute-1 nova_compute[183403]: 2026-01-26 15:27:11.382 183407 DEBUG nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.065 183407 DEBUG nova.network.neutron [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Successfully created port: 0cda9ffa-32fa-4511-a584-31b46b813df6 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.409 183407 DEBUG nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.410 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.411 183407 INFO nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Creating image(s)
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.411 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Acquiring lock "/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.412 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.413 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.415 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.422 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.430 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.510 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.512 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.513 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.514 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.520 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.521 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.588 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.589 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.631 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.633 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.633 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.717 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.718 183407 DEBUG nova.virt.disk.api [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Checking if we can resize image /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.718 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.781 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.782 183407 DEBUG nova.virt.disk.api [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Cannot resize image /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.783 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.783 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Ensure instance console log exists: /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.784 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.784 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:12 compute-1 nova_compute[183403]: 2026-01-26 15:27:12.784 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:13 compute-1 nova_compute[183403]: 2026-01-26 15:27:13.654 183407 DEBUG nova.network.neutron [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Successfully updated port: 0cda9ffa-32fa-4511-a584-31b46b813df6 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:27:13 compute-1 nova_compute[183403]: 2026-01-26 15:27:13.720 183407 DEBUG nova.compute.manager [req-2c0b270e-6789-475b-a50c-6183dd2064f7 req-d151f171-bc1c-4974-a80c-8042679679ea 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-changed-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:27:13 compute-1 nova_compute[183403]: 2026-01-26 15:27:13.721 183407 DEBUG nova.compute.manager [req-2c0b270e-6789-475b-a50c-6183dd2064f7 req-d151f171-bc1c-4974-a80c-8042679679ea 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Refreshing instance network info cache due to event network-changed-0cda9ffa-32fa-4511-a584-31b46b813df6. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:27:13 compute-1 nova_compute[183403]: 2026-01-26 15:27:13.722 183407 DEBUG oslo_concurrency.lockutils [req-2c0b270e-6789-475b-a50c-6183dd2064f7 req-d151f171-bc1c-4974-a80c-8042679679ea 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-1d1f8313-3681-4a88-9aef-3c69f49aaa19" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:27:13 compute-1 nova_compute[183403]: 2026-01-26 15:27:13.723 183407 DEBUG oslo_concurrency.lockutils [req-2c0b270e-6789-475b-a50c-6183dd2064f7 req-d151f171-bc1c-4974-a80c-8042679679ea 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-1d1f8313-3681-4a88-9aef-3c69f49aaa19" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:27:13 compute-1 nova_compute[183403]: 2026-01-26 15:27:13.723 183407 DEBUG nova.network.neutron [req-2c0b270e-6789-475b-a50c-6183dd2064f7 req-d151f171-bc1c-4974-a80c-8042679679ea 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Refreshing network info cache for port 0cda9ffa-32fa-4511-a584-31b46b813df6 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:27:14 compute-1 nova_compute[183403]: 2026-01-26 15:27:14.162 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Acquiring lock "refresh_cache-1d1f8313-3681-4a88-9aef-3c69f49aaa19" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:27:14 compute-1 nova_compute[183403]: 2026-01-26 15:27:14.232 183407 WARNING neutronclient.v2_0.client [req-2c0b270e-6789-475b-a50c-6183dd2064f7 req-d151f171-bc1c-4974-a80c-8042679679ea 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:27:14 compute-1 nova_compute[183403]: 2026-01-26 15:27:14.795 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:15 compute-1 nova_compute[183403]: 2026-01-26 15:27:15.009 183407 DEBUG nova.network.neutron [req-2c0b270e-6789-475b-a50c-6183dd2064f7 req-d151f171-bc1c-4974-a80c-8042679679ea 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:27:15 compute-1 nova_compute[183403]: 2026-01-26 15:27:15.091 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:27:15 compute-1 nova_compute[183403]: 2026-01-26 15:27:15.186 183407 DEBUG nova.network.neutron [req-2c0b270e-6789-475b-a50c-6183dd2064f7 req-d151f171-bc1c-4974-a80c-8042679679ea 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:27:15 compute-1 nova_compute[183403]: 2026-01-26 15:27:15.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:27:15 compute-1 nova_compute[183403]: 2026-01-26 15:27:15.692 183407 DEBUG oslo_concurrency.lockutils [req-2c0b270e-6789-475b-a50c-6183dd2064f7 req-d151f171-bc1c-4974-a80c-8042679679ea 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-1d1f8313-3681-4a88-9aef-3c69f49aaa19" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:27:15 compute-1 nova_compute[183403]: 2026-01-26 15:27:15.693 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Acquired lock "refresh_cache-1d1f8313-3681-4a88-9aef-3c69f49aaa19" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:27:15 compute-1 nova_compute[183403]: 2026-01-26 15:27:15.693 183407 DEBUG nova.network.neutron [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:27:15 compute-1 nova_compute[183403]: 2026-01-26 15:27:15.934 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:16 compute-1 nova_compute[183403]: 2026-01-26 15:27:16.696 183407 DEBUG nova.network.neutron [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:27:16 compute-1 podman[211187]: 2026-01-26 15:27:16.896476798 +0000 UTC m=+0.071349625 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:27:16 compute-1 podman[211188]: 2026-01-26 15:27:16.911000644 +0000 UTC m=+0.079551088 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Jan 26 15:27:16 compute-1 nova_compute[183403]: 2026-01-26 15:27:16.923 183407 WARNING neutronclient.v2_0.client [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.033 183407 DEBUG nova.network.neutron [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Updating instance_info_cache with network_info: [{"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.548 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Releasing lock "refresh_cache-1d1f8313-3681-4a88-9aef-3c69f49aaa19" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.549 183407 DEBUG nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Instance network_info: |[{"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.553 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Start _get_guest_xml network_info=[{"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.559 183407 WARNING nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.562 183407 DEBUG nova.virt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1195854132', uuid='1d1f8313-3681-4a88-9aef-3c69f49aaa19'), owner=OwnerMeta(userid='ce151e26874c4c369f13ecc08f41d47f', username='tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436-project-admin', projectid='ab5cf25b2abc42399ccb7131f5e1e913', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": 
"0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769441238.5618796) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.567 183407 DEBUG nova.virt.libvirt.host [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.568 183407 DEBUG nova.virt.libvirt.host [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.571 183407 DEBUG nova.virt.libvirt.host [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.572 183407 DEBUG nova.virt.libvirt.host [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.573 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.573 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.574 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.574 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.574 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.574 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.574 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.574 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.575 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.575 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.575 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.575 183407 DEBUG nova.virt.hardware [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.579 183407 DEBUG nova.virt.libvirt.vif [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:27:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1195854132',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1195854132',id=21,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab5cf25b2abc42399ccb7131f5e1e913',ramdisk_id='',reservation_id='r-vqqlw3za',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436',owner_user_name='tem
pest-TestExecuteVmWorkloadBalanceStrategy-1308243436-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:27:11Z,user_data=None,user_id='ce151e26874c4c369f13ecc08f41d47f',uuid=1d1f8313-3681-4a88-9aef-3c69f49aaa19,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.579 183407 DEBUG nova.network.os_vif_util [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Converting VIF {"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.580 183407 DEBUG nova.network.os_vif_util [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ee:5d,bridge_name='br-int',has_traffic_filtering=True,id=0cda9ffa-32fa-4511-a584-31b46b813df6,network=Network(567e8645-0094-48f0-9603-67223f9e4c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cda9ffa-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:27:18 compute-1 nova_compute[183403]: 2026-01-26 15:27:18.580 183407 DEBUG nova.objects.instance [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1d1f8313-3681-4a88-9aef-3c69f49aaa19 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.089 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <uuid>1d1f8313-3681-4a88-9aef-3c69f49aaa19</uuid>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <name>instance-00000015</name>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1195854132</nova:name>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:27:18</nova:creationTime>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:27:19 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:27:19 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:user uuid="ce151e26874c4c369f13ecc08f41d47f">tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436-project-admin</nova:user>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:project uuid="ab5cf25b2abc42399ccb7131f5e1e913">tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436</nova:project>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         <nova:port uuid="0cda9ffa-32fa-4511-a584-31b46b813df6">
Jan 26 15:27:19 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <system>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <entry name="serial">1d1f8313-3681-4a88-9aef-3c69f49aaa19</entry>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <entry name="uuid">1d1f8313-3681-4a88-9aef-3c69f49aaa19</entry>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     </system>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <os>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   </os>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <features>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   </features>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.config"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:e5:ee:5d"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <target dev="tap0cda9ffa-32"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/console.log" append="off"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <video>
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     </video>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:27:19 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:27:19 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:27:19 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:27:19 compute-1 nova_compute[183403]: </domain>
Jan 26 15:27:19 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.091 183407 DEBUG nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Preparing to wait for external event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.092 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.092 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.093 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.094 183407 DEBUG nova.virt.libvirt.vif [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:27:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1195854132',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1195854132',id=21,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab5cf25b2abc42399ccb7131f5e1e913',ramdisk_id='',reservation_id='r-vqqlw3za',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436',owner_user
_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:27:11Z,user_data=None,user_id='ce151e26874c4c369f13ecc08f41d47f',uuid=1d1f8313-3681-4a88-9aef-3c69f49aaa19,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.095 183407 DEBUG nova.network.os_vif_util [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Converting VIF {"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.096 183407 DEBUG nova.network.os_vif_util [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ee:5d,bridge_name='br-int',has_traffic_filtering=True,id=0cda9ffa-32fa-4511-a584-31b46b813df6,network=Network(567e8645-0094-48f0-9603-67223f9e4c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cda9ffa-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.097 183407 DEBUG os_vif [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ee:5d,bridge_name='br-int',has_traffic_filtering=True,id=0cda9ffa-32fa-4511-a584-31b46b813df6,network=Network(567e8645-0094-48f0-9603-67223f9e4c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cda9ffa-32') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.098 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.099 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.099 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.101 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.101 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '730566b9-a28b-552d-aa47-4703b6e0a070', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.103 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.105 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.114 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.115 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0cda9ffa-32, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.116 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap0cda9ffa-32, col_values=(('qos', UUID('0b66a896-4604-4e26-8ac0-2fadb72b4122')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.120 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap0cda9ffa-32, col_values=(('external_ids', {'iface-id': '0cda9ffa-32fa-4511-a584-31b46b813df6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:ee:5d', 'vm-uuid': '1d1f8313-3681-4a88-9aef-3c69f49aaa19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.124 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:19 compute-1 NetworkManager[55716]: <info>  [1769441239.1271] manager: (tap0cda9ffa-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.127 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.134 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.135 183407 INFO os_vif [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ee:5d,bridge_name='br-int',has_traffic_filtering=True,id=0cda9ffa-32fa-4511-a584-31b46b813df6,network=Network(567e8645-0094-48f0-9603-67223f9e4c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cda9ffa-32')
Jan 26 15:27:19 compute-1 openstack_network_exporter[195610]: ERROR   15:27:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:27:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:27:19 compute-1 openstack_network_exporter[195610]: ERROR   15:27:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:27:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:27:19 compute-1 nova_compute[183403]: 2026-01-26 15:27:19.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:27:20 compute-1 nova_compute[183403]: 2026-01-26 15:27:20.936 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:21 compute-1 nova_compute[183403]: 2026-01-26 15:27:21.401 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:21 compute-1 nova_compute[183403]: 2026-01-26 15:27:21.402 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:21 compute-1 nova_compute[183403]: 2026-01-26 15:27:21.402 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:21 compute-1 nova_compute[183403]: 2026-01-26 15:27:21.402 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:27:21 compute-1 nova_compute[183403]: 2026-01-26 15:27:21.918 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:27:21 compute-1 nova_compute[183403]: 2026-01-26 15:27:21.919 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:27:21 compute-1 nova_compute[183403]: 2026-01-26 15:27:21.919 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] No VIF found with MAC fa:16:3e:e5:ee:5d, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:27:21 compute-1 nova_compute[183403]: 2026-01-26 15:27:21.920 183407 INFO nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Using config drive
Jan 26 15:27:22 compute-1 nova_compute[183403]: 2026-01-26 15:27:22.557 183407 WARNING neutronclient.v2_0.client [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:27:22 compute-1 nova_compute[183403]: 2026-01-26 15:27:22.699 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:27:22 compute-1 nova_compute[183403]: 2026-01-26 15:27:22.760 183407 INFO nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Creating config drive at /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.config
Jan 26 15:27:22 compute-1 nova_compute[183403]: 2026-01-26 15:27:22.768 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp5sb0paok execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:27:22 compute-1 nova_compute[183403]: 2026-01-26 15:27:22.791 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:27:22 compute-1 nova_compute[183403]: 2026-01-26 15:27:22.792 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:27:22 compute-1 nova_compute[183403]: 2026-01-26 15:27:22.880 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:27:22 compute-1 nova_compute[183403]: 2026-01-26 15:27:22.911 183407 DEBUG oslo_concurrency.processutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp5sb0paok" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:27:23 compute-1 kernel: tap0cda9ffa-32: entered promiscuous mode
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.026 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:23 compute-1 NetworkManager[55716]: <info>  [1769441243.0309] manager: (tap0cda9ffa-32): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Jan 26 15:27:23 compute-1 ovn_controller[95641]: 2026-01-26T15:27:23Z|00163|binding|INFO|Claiming lport 0cda9ffa-32fa-4511-a584-31b46b813df6 for this chassis.
Jan 26 15:27:23 compute-1 ovn_controller[95641]: 2026-01-26T15:27:23Z|00164|binding|INFO|0cda9ffa-32fa-4511-a584-31b46b813df6: Claiming fa:16:3e:e5:ee:5d 10.100.0.12
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.034 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.042 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:23 compute-1 systemd-udevd[211256]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:27:23 compute-1 systemd-machined[154697]: New machine qemu-15-instance-00000015.
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.097 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:ee:5d 10.100.0.12'], port_security=['fa:16:3e:e5:ee:5d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1d1f8313-3681-4a88-9aef-3c69f49aaa19', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-567e8645-0094-48f0-9603-67223f9e4c7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab5cf25b2abc42399ccb7131f5e1e913', 'neutron:revision_number': '4', 'neutron:security_group_ids': '42fdb3ac-2f55-4285-975e-aaa7d141bc66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c086ead-9989-49f1-93e0-00527766eebe, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=0cda9ffa-32fa-4511-a584-31b46b813df6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.100 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 0cda9ffa-32fa-4511-a584-31b46b813df6 in datapath 567e8645-0094-48f0-9603-67223f9e4c7a bound to our chassis
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.101 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 567e8645-0094-48f0-9603-67223f9e4c7a
Jan 26 15:27:23 compute-1 NetworkManager[55716]: <info>  [1769441243.1090] device (tap0cda9ffa-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:27:23 compute-1 NetworkManager[55716]: <info>  [1769441243.1115] device (tap0cda9ffa-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:27:23 compute-1 systemd[1]: Started Virtual Machine qemu-15-instance-00000015.
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.119 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:23 compute-1 ovn_controller[95641]: 2026-01-26T15:27:23Z|00165|binding|INFO|Setting lport 0cda9ffa-32fa-4511-a584-31b46b813df6 ovn-installed in OVS
Jan 26 15:27:23 compute-1 ovn_controller[95641]: 2026-01-26T15:27:23Z|00166|binding|INFO|Setting lport 0cda9ffa-32fa-4511-a584-31b46b813df6 up in Southbound
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.123 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.136 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[daecc800-4aa2-49ed-b90f-a29a2a37c92d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.137 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap567e8645-01 in ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.140 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap567e8645-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.140 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d0838701-1c41-472d-9b9f-c75dc3a4b9cb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.140 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[785fcb86-9a53-49d9-abb3-e8f58920a121]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.159 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[c709daab-7280-45f5-9bde-ee80ca915539]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.168 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1265b242-96b6-4555-90f1-32eee96a9160]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.208 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.210 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.211 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[c830ef2e-5485-4d27-8814-731b64f149d4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.216 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1d1b5866-2878-4ccd-8c0e-f1d7a61dbcd8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 NetworkManager[55716]: <info>  [1769441243.2182] manager: (tap567e8645-00): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.256 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[000c6a18-7570-4c5f-868d-c563ba17bc51]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.258 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.258 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5830MB free_disk=73.14440536499023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.259 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.259 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.260 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[371735c7-abda-4c6f-b1ac-2deea5a12b6c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 NetworkManager[55716]: <info>  [1769441243.2872] device (tap567e8645-00): carrier: link connected
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.291 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[92647f7c-e742-4f60-8bf9-c093f362f392]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.311 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4230d8ff-c347-4ea4-af19-57f99ccfa77f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap567e8645-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:f3:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496589, 'reachable_time': 41334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211290, 'error': None, 'target': 'ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.328 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[24d1d307-5b05-4b4f-81b3-9c6152d9eb96]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:f3ea'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496589, 'tstamp': 496589}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211291, 'error': None, 'target': 'ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.355 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[be0ef7c6-dc43-4bbd-b433-004c20cc6019]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap567e8645-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:f3:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496589, 'reachable_time': 41334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211292, 'error': None, 'target': 'ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.379 183407 DEBUG nova.compute.manager [req-1fd469e5-ce69-4052-bc06-d469087e8cc0 req-7ba58186-8629-46e6-b13d-1c286cda5c0a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.380 183407 DEBUG oslo_concurrency.lockutils [req-1fd469e5-ce69-4052-bc06-d469087e8cc0 req-7ba58186-8629-46e6-b13d-1c286cda5c0a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.380 183407 DEBUG oslo_concurrency.lockutils [req-1fd469e5-ce69-4052-bc06-d469087e8cc0 req-7ba58186-8629-46e6-b13d-1c286cda5c0a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.381 183407 DEBUG oslo_concurrency.lockutils [req-1fd469e5-ce69-4052-bc06-d469087e8cc0 req-7ba58186-8629-46e6-b13d-1c286cda5c0a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.381 183407 DEBUG nova.compute.manager [req-1fd469e5-ce69-4052-bc06-d469087e8cc0 req-7ba58186-8629-46e6-b13d-1c286cda5c0a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Processing event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.407 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[86e65bb4-c717-4934-8437-e1d919909d2e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.504 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c659a1af-88d6-4baa-95b5-fcc493a49aa7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.506 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap567e8645-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.506 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.507 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap567e8645-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:27:23 compute-1 NetworkManager[55716]: <info>  [1769441243.5098] manager: (tap567e8645-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 26 15:27:23 compute-1 kernel: tap567e8645-00: entered promiscuous mode
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.511 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.517 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap567e8645-00, col_values=(('external_ids', {'iface-id': '3d7320e6-f866-41a6-98db-038f98178a52'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:27:23 compute-1 ovn_controller[95641]: 2026-01-26T15:27:23Z|00167|binding|INFO|Releasing lport 3d7320e6-f866-41a6-98db-038f98178a52 from this chassis (sb_readonly=0)
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.519 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.523 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bc50df07-5dc1-482d-bd47-e018e3b31b55]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.524 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.524 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.525 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 567e8645-0094-48f0-9603-67223f9e4c7a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.525 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.526 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a4565621-5a62-46be-91da-5ad0bc25bde4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.527 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.527 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[43a5836e-6831-42a3-ad5a-7b372c879c93]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.528 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-567e8645-0094-48f0-9603-67223f9e4c7a
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID 567e8645-0094-48f0-9603-67223f9e4c7a
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:27:23 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:23.529 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a', 'env', 'PROCESS_TAG=haproxy-567e8645-0094-48f0-9603-67223f9e4c7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/567e8645-0094-48f0-9603-67223f9e4c7a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.530 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.554 183407 DEBUG nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.571 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.576 183407 INFO nova.virt.libvirt.driver [-] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Instance spawned successfully.
Jan 26 15:27:23 compute-1 nova_compute[183403]: 2026-01-26 15:27:23.576 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:27:24 compute-1 podman[211329]: 2026-01-26 15:27:24.006947646 +0000 UTC m=+0.074003096 container create 0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 15:27:24 compute-1 podman[211329]: 2026-01-26 15:27:23.963777241 +0000 UTC m=+0.030832721 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:27:24 compute-1 systemd[1]: Started libpod-conmon-0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec.scope.
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.091 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.093 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.094 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.096 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.096 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.097 183407 DEBUG nova.virt.libvirt.driver [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.130 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:24 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:27:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cce6bc5f7f94890e68982a3647763c8fc0e8213c3a852cd0f725f989e8997169/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:27:24 compute-1 podman[211329]: 2026-01-26 15:27:24.169877263 +0000 UTC m=+0.236932733 container init 0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 26 15:27:24 compute-1 podman[211329]: 2026-01-26 15:27:24.175581939 +0000 UTC m=+0.242637349 container start 0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:27:24 compute-1 neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a[211344]: [NOTICE]   (211348) : New worker (211350) forked
Jan 26 15:27:24 compute-1 neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a[211344]: [NOTICE]   (211348) : Loading success.
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.337 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 1d1f8313-3681-4a88-9aef-3c69f49aaa19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.338 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.338 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:27:23 up  1:22,  0 user,  load average: 0.11, 0.21, 0.26\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_ab5cf25b2abc42399ccb7131f5e1e913': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.387 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.433 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.434 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.446 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.462 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.495 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.723 183407 INFO nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Took 12.31 seconds to spawn the instance on the hypervisor.
Jan 26 15:27:24 compute-1 nova_compute[183403]: 2026-01-26 15:27:24.724 183407 DEBUG nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.053 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.341 183407 INFO nova.compute.manager [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Took 17.62 seconds to build instance.
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.446 183407 DEBUG nova.compute.manager [req-e738c068-5a95-4dbd-b298-ab07d9cc5ccd req-349abc59-b76b-442d-8785-c3990ae9261a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.447 183407 DEBUG oslo_concurrency.lockutils [req-e738c068-5a95-4dbd-b298-ab07d9cc5ccd req-349abc59-b76b-442d-8785-c3990ae9261a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.447 183407 DEBUG oslo_concurrency.lockutils [req-e738c068-5a95-4dbd-b298-ab07d9cc5ccd req-349abc59-b76b-442d-8785-c3990ae9261a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.447 183407 DEBUG oslo_concurrency.lockutils [req-e738c068-5a95-4dbd-b298-ab07d9cc5ccd req-349abc59-b76b-442d-8785-c3990ae9261a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.447 183407 DEBUG nova.compute.manager [req-e738c068-5a95-4dbd-b298-ab07d9cc5ccd req-349abc59-b76b-442d-8785-c3990ae9261a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] No waiting events found dispatching network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.447 183407 WARNING nova.compute.manager [req-e738c068-5a95-4dbd-b298-ab07d9cc5ccd req-349abc59-b76b-442d-8785-c3990ae9261a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received unexpected event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 for instance with vm_state active and task_state None.
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.566 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.567 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.308s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.845 183407 DEBUG oslo_concurrency.lockutils [None req-49a87682-775b-4505-8402-06a119c58dcf ce151e26874c4c369f13ecc08f41d47f ab5cf25b2abc42399ccb7131f5e1e913 - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.147s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:25 compute-1 nova_compute[183403]: 2026-01-26 15:27:25.938 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:27 compute-1 nova_compute[183403]: 2026-01-26 15:27:27.568 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:27:27 compute-1 nova_compute[183403]: 2026-01-26 15:27:27.568 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:27:27 compute-1 nova_compute[183403]: 2026-01-26 15:27:27.569 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:27:27 compute-1 nova_compute[183403]: 2026-01-26 15:27:27.569 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:27:27 compute-1 nova_compute[183403]: 2026-01-26 15:27:27.569 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:27:27 compute-1 nova_compute[183403]: 2026-01-26 15:27:27.570 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:27:27 compute-1 podman[211360]: 2026-01-26 15:27:27.979757182 +0000 UTC m=+0.140340502 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 15:27:27 compute-1 podman[211359]: 2026-01-26 15:27:27.990076134 +0000 UTC m=+0.161039907 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:27:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:29.074 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:27:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:29.075 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:27:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:27:29.075 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:27:29 compute-1 nova_compute[183403]: 2026-01-26 15:27:29.156 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:30 compute-1 nova_compute[183403]: 2026-01-26 15:27:30.941 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:34 compute-1 nova_compute[183403]: 2026-01-26 15:27:34.210 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:35 compute-1 podman[192725]: time="2026-01-26T15:27:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:27:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:27:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:27:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:27:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2656 "" "Go-http-client/1.1"
Jan 26 15:27:35 compute-1 nova_compute[183403]: 2026-01-26 15:27:35.943 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:36 compute-1 ovn_controller[95641]: 2026-01-26T15:27:36Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:ee:5d 10.100.0.12
Jan 26 15:27:36 compute-1 ovn_controller[95641]: 2026-01-26T15:27:36Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:ee:5d 10.100.0.12
Jan 26 15:27:39 compute-1 nova_compute[183403]: 2026-01-26 15:27:39.247 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:40 compute-1 nova_compute[183403]: 2026-01-26 15:27:40.946 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:44 compute-1 nova_compute[183403]: 2026-01-26 15:27:44.305 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:45 compute-1 nova_compute[183403]: 2026-01-26 15:27:45.948 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:47 compute-1 podman[211431]: 2026-01-26 15:27:47.881157294 +0000 UTC m=+0.061269540 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:27:47 compute-1 podman[211432]: 2026-01-26 15:27:47.901297212 +0000 UTC m=+0.071481838 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41)
Jan 26 15:27:48 compute-1 nova_compute[183403]: 2026-01-26 15:27:48.348 183407 DEBUG nova.compute.manager [None req-21a04399-03ed-4e03-958d-0a1adbdf3c18 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Jan 26 15:27:48 compute-1 nova_compute[183403]: 2026-01-26 15:27:48.400 183407 DEBUG nova.compute.provider_tree [None req-21a04399-03ed-4e03-958d-0a1adbdf3c18 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 30 to 31 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 15:27:49 compute-1 nova_compute[183403]: 2026-01-26 15:27:49.359 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:49 compute-1 openstack_network_exporter[195610]: ERROR   15:27:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:27:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:27:49 compute-1 openstack_network_exporter[195610]: ERROR   15:27:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:27:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:27:50 compute-1 nova_compute[183403]: 2026-01-26 15:27:50.950 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:53 compute-1 ovn_controller[95641]: 2026-01-26T15:27:53Z|00168|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 15:27:54 compute-1 nova_compute[183403]: 2026-01-26 15:27:54.361 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:55 compute-1 nova_compute[183403]: 2026-01-26 15:27:55.953 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:27:58 compute-1 nova_compute[183403]: 2026-01-26 15:27:58.398 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Check if temp file /var/lib/nova/instances/tmp4em9vec8 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 15:27:58 compute-1 nova_compute[183403]: 2026-01-26 15:27:58.404 183407 DEBUG nova.compute.manager [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4em9vec8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1d1f8313-3681-4a88-9aef-3c69f49aaa19',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 15:27:58 compute-1 podman[211479]: 2026-01-26 15:27:58.916574416 +0000 UTC m=+0.088410680 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 26 15:27:58 compute-1 podman[211478]: 2026-01-26 15:27:58.974086061 +0000 UTC m=+0.146690467 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:27:59 compute-1 nova_compute[183403]: 2026-01-26 15:27:59.397 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:00 compute-1 nova_compute[183403]: 2026-01-26 15:28:00.955 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:03 compute-1 nova_compute[183403]: 2026-01-26 15:28:03.201 183407 DEBUG oslo_concurrency.processutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:28:03 compute-1 nova_compute[183403]: 2026-01-26 15:28:03.291 183407 DEBUG oslo_concurrency.processutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:28:03 compute-1 nova_compute[183403]: 2026-01-26 15:28:03.293 183407 DEBUG oslo_concurrency.processutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:28:03 compute-1 nova_compute[183403]: 2026-01-26 15:28:03.384 183407 DEBUG oslo_concurrency.processutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:28:03 compute-1 nova_compute[183403]: 2026-01-26 15:28:03.386 183407 DEBUG nova.compute.manager [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Preparing to wait for external event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:28:03 compute-1 nova_compute[183403]: 2026-01-26 15:28:03.387 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:03 compute-1 nova_compute[183403]: 2026-01-26 15:28:03.388 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:03 compute-1 nova_compute[183403]: 2026-01-26 15:28:03.388 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:04 compute-1 nova_compute[183403]: 2026-01-26 15:28:04.431 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:05 compute-1 podman[192725]: time="2026-01-26T15:28:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:28:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:28:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:28:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:28:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2662 "" "Go-http-client/1.1"
Jan 26 15:28:05 compute-1 nova_compute[183403]: 2026-01-26 15:28:05.957 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:09 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:09.378 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:28:09 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:09.379 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:28:09 compute-1 nova_compute[183403]: 2026-01-26 15:28:09.394 183407 DEBUG nova.compute.manager [req-4ba44fee-639c-44ee-bd61-9665cb7c1d37 req-92176e2d-bc44-4ca7-b750-55e694fa731e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-unplugged-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:28:09 compute-1 nova_compute[183403]: 2026-01-26 15:28:09.394 183407 DEBUG oslo_concurrency.lockutils [req-4ba44fee-639c-44ee-bd61-9665cb7c1d37 req-92176e2d-bc44-4ca7-b750-55e694fa731e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:09 compute-1 nova_compute[183403]: 2026-01-26 15:28:09.394 183407 DEBUG oslo_concurrency.lockutils [req-4ba44fee-639c-44ee-bd61-9665cb7c1d37 req-92176e2d-bc44-4ca7-b750-55e694fa731e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:09 compute-1 nova_compute[183403]: 2026-01-26 15:28:09.395 183407 DEBUG oslo_concurrency.lockutils [req-4ba44fee-639c-44ee-bd61-9665cb7c1d37 req-92176e2d-bc44-4ca7-b750-55e694fa731e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:09 compute-1 nova_compute[183403]: 2026-01-26 15:28:09.395 183407 DEBUG nova.compute.manager [req-4ba44fee-639c-44ee-bd61-9665cb7c1d37 req-92176e2d-bc44-4ca7-b750-55e694fa731e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] No event matching network-vif-unplugged-0cda9ffa-32fa-4511-a584-31b46b813df6 in dict_keys([('network-vif-plugged', '0cda9ffa-32fa-4511-a584-31b46b813df6')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 15:28:09 compute-1 nova_compute[183403]: 2026-01-26 15:28:09.395 183407 DEBUG nova.compute.manager [req-4ba44fee-639c-44ee-bd61-9665cb7c1d37 req-92176e2d-bc44-4ca7-b750-55e694fa731e 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-unplugged-0cda9ffa-32fa-4511-a584-31b46b813df6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:28:09 compute-1 nova_compute[183403]: 2026-01-26 15:28:09.418 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:09 compute-1 nova_compute[183403]: 2026-01-26 15:28:09.433 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:10 compute-1 nova_compute[183403]: 2026-01-26 15:28:10.942 183407 INFO nova.compute.manager [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Took 7.55 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Jan 26 15:28:10 compute-1 nova_compute[183403]: 2026-01-26 15:28:10.960 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.453 183407 DEBUG nova.compute.manager [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.454 183407 DEBUG oslo_concurrency.lockutils [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.454 183407 DEBUG oslo_concurrency.lockutils [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.455 183407 DEBUG oslo_concurrency.lockutils [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.455 183407 DEBUG nova.compute.manager [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Processing event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.456 183407 DEBUG nova.compute.manager [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-changed-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.456 183407 DEBUG nova.compute.manager [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Refreshing instance network info cache due to event network-changed-0cda9ffa-32fa-4511-a584-31b46b813df6. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.456 183407 DEBUG oslo_concurrency.lockutils [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-1d1f8313-3681-4a88-9aef-3c69f49aaa19" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.457 183407 DEBUG oslo_concurrency.lockutils [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-1d1f8313-3681-4a88-9aef-3c69f49aaa19" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.457 183407 DEBUG nova.network.neutron [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Refreshing network info cache for port 0cda9ffa-32fa-4511-a584-31b46b813df6 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.460 183407 DEBUG nova.compute.manager [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.968 183407 WARNING neutronclient.v2_0.client [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:28:11 compute-1 nova_compute[183403]: 2026-01-26 15:28:11.975 183407 DEBUG nova.compute.manager [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4em9vec8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1d1f8313-3681-4a88-9aef-3c69f49aaa19',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(99e0ea5b-2d8c-46b9-ac63-0aa28bc2aae8),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 15:28:12 compute-1 nova_compute[183403]: 2026-01-26 15:28:12.429 183407 WARNING neutronclient.v2_0.client [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:28:12 compute-1 nova_compute[183403]: 2026-01-26 15:28:12.490 183407 DEBUG nova.objects.instance [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 1d1f8313-3681-4a88-9aef-3c69f49aaa19 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:28:12 compute-1 nova_compute[183403]: 2026-01-26 15:28:12.492 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 15:28:12 compute-1 nova_compute[183403]: 2026-01-26 15:28:12.495 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 15:28:12 compute-1 nova_compute[183403]: 2026-01-26 15:28:12.495 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 15:28:12 compute-1 nova_compute[183403]: 2026-01-26 15:28:12.998 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 15:28:12 compute-1 nova_compute[183403]: 2026-01-26 15:28:12.998 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.009 183407 DEBUG nova.virt.libvirt.vif [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:27:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1195854132',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1195854132',id=21,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:27:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab5cf25b2abc42399ccb7131f5e1e913',ramdisk_id='',reservation_id='r-vqqlw3za',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw
_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:27:24Z,user_data=None,user_id='ce151e26874c4c369f13ecc08f41d47f',uuid=1d1f8313-3681-4a88-9aef-3c69f49aaa19,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.009 183407 DEBUG nova.network.os_vif_util [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.010 183407 DEBUG nova.network.os_vif_util [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ee:5d,bridge_name='br-int',has_traffic_filtering=True,id=0cda9ffa-32fa-4511-a584-31b46b813df6,network=Network(567e8645-0094-48f0-9603-67223f9e4c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cda9ffa-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.011 183407 DEBUG nova.virt.libvirt.migration [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <mac address="fa:16:3e:e5:ee:5d"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <model type="virtio"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <mtu size="1442"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <target dev="tap0cda9ffa-32"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]: </interface>
Jan 26 15:28:13 compute-1 nova_compute[183403]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.012 183407 DEBUG nova.virt.libvirt.migration [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <name>instance-00000015</name>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <uuid>1d1f8313-3681-4a88-9aef-3c69f49aaa19</uuid>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1195854132</nova:name>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:27:18</nova:creationTime>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:28:13 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:28:13 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:user uuid="ce151e26874c4c369f13ecc08f41d47f">tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436-project-admin</nova:user>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:project uuid="ab5cf25b2abc42399ccb7131f5e1e913">tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436</nova:project>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:port uuid="0cda9ffa-32fa-4511-a584-31b46b813df6">
Jan 26 15:28:13 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <system>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="serial">1d1f8313-3681-4a88-9aef-3c69f49aaa19</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="uuid">1d1f8313-3681-4a88-9aef-3c69f49aaa19</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </system>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <os>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </os>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <features>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </features>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.config"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <interface type="ethernet"><mac address="fa:16:3e:e5:ee:5d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0cda9ffa-32"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </interface><serial type="pty">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/console.log" append="off"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </target>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/console.log" append="off"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </console>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </input>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <video>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </video>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]: </domain>
Jan 26 15:28:13 compute-1 nova_compute[183403]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.013 183407 DEBUG nova.virt.libvirt.migration [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <name>instance-00000015</name>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <uuid>1d1f8313-3681-4a88-9aef-3c69f49aaa19</uuid>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1195854132</nova:name>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:27:18</nova:creationTime>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:28:13 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:28:13 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:user uuid="ce151e26874c4c369f13ecc08f41d47f">tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436-project-admin</nova:user>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:project uuid="ab5cf25b2abc42399ccb7131f5e1e913">tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436</nova:project>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:port uuid="0cda9ffa-32fa-4511-a584-31b46b813df6">
Jan 26 15:28:13 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <system>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="serial">1d1f8313-3681-4a88-9aef-3c69f49aaa19</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="uuid">1d1f8313-3681-4a88-9aef-3c69f49aaa19</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </system>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <os>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </os>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <features>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </features>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.config"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <interface type="ethernet"><mac address="fa:16:3e:e5:ee:5d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0cda9ffa-32"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </interface><serial type="pty">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/console.log" append="off"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </target>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/console.log" append="off"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </console>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </input>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <video>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </video>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]: </domain>
Jan 26 15:28:13 compute-1 nova_compute[183403]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.014 183407 DEBUG nova.virt.libvirt.migration [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <name>instance-00000015</name>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <uuid>1d1f8313-3681-4a88-9aef-3c69f49aaa19</uuid>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1195854132</nova:name>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:27:18</nova:creationTime>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:28:13 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:28:13 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:user uuid="ce151e26874c4c369f13ecc08f41d47f">tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436-project-admin</nova:user>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:project uuid="ab5cf25b2abc42399ccb7131f5e1e913">tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436</nova:project>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <nova:port uuid="0cda9ffa-32fa-4511-a584-31b46b813df6">
Jan 26 15:28:13 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <memory unit="KiB">131072</memory>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <vcpu placement="static">1</vcpu>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <resource>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <partition>/machine</partition>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </resource>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <system>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="serial">1d1f8313-3681-4a88-9aef-3c69f49aaa19</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="uuid">1d1f8313-3681-4a88-9aef-3c69f49aaa19</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </system>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <os>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </os>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <features>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <vmcoreinfo state="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </features>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <model fallback="allow">Nehalem</model>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <on_poweroff>destroy</on_poweroff>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <on_reboot>restart</on_reboot>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <on_crash>destroy</on_crash>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/disk.config"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <readonly/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="1" port="0x10"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="2" port="0x11"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="3" port="0x12"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="4" port="0x13"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="5" port="0x14"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="6" port="0x15"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="7" port="0x16"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="8" port="0x17"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="9" port="0x18"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="10" port="0x19"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="11" port="0x1a"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="12" port="0x1b"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="13" port="0x1c"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="14" port="0x1d"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="15" port="0x1e"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="16" port="0x1f"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="17" port="0x20"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="18" port="0x21"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="19" port="0x22"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="20" port="0x23"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="21" port="0x24"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="22" port="0x25"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="23" port="0x26"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="24" port="0x27"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-root-port"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target chassis="25" port="0x28"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model name="pcie-pci-bridge"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <controller type="sata" index="0">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </controller>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <interface type="ethernet"><mac address="fa:16:3e:e5:ee:5d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0cda9ffa-32"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </interface><serial type="pty">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/console.log" append="off"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target type="isa-serial" port="0">
Jan 26 15:28:13 compute-1 nova_compute[183403]:         <model name="isa-serial"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       </target>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <console type="pty">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19/console.log" append="off"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <target type="serial" port="0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </console>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="usb" bus="0" port="1"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </input>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <input type="mouse" bus="ps2"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <listen type="address" address="::"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </graphics>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <video>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </video>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:28:13 compute-1 nova_compute[183403]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:28:13 compute-1 nova_compute[183403]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 15:28:13 compute-1 nova_compute[183403]: </domain>
Jan 26 15:28:13 compute-1 nova_compute[183403]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.015 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.127 183407 DEBUG nova.network.neutron [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Updated VIF entry in instance network info cache for port 0cda9ffa-32fa-4511-a584-31b46b813df6. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.128 183407 DEBUG nova.network.neutron [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Updating instance_info_cache with network_info: [{"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.501 183407 DEBUG nova.virt.libvirt.migration [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.501 183407 INFO nova.virt.libvirt.migration [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 15:28:13 compute-1 nova_compute[183403]: 2026-01-26 15:28:13.636 183407 DEBUG oslo_concurrency.lockutils [req-9d9e072c-174c-4468-a886-4dc8e09e4980 req-b543eaeb-02d0-4385-b65f-6816ad479615 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-1d1f8313-3681-4a88-9aef-3c69f49aaa19" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:28:14 compute-1 nova_compute[183403]: 2026-01-26 15:28:14.471 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:14 compute-1 nova_compute[183403]: 2026-01-26 15:28:14.522 183407 INFO nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.027 183407 DEBUG nova.virt.libvirt.migration [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.027 183407 DEBUG nova.virt.libvirt.migration [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 15:28:15 compute-1 kernel: tap0cda9ffa-32 (unregistering): left promiscuous mode
Jan 26 15:28:15 compute-1 NetworkManager[55716]: <info>  [1769441295.3517] device (tap0cda9ffa-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:28:15 compute-1 ovn_controller[95641]: 2026-01-26T15:28:15Z|00169|binding|INFO|Releasing lport 0cda9ffa-32fa-4511-a584-31b46b813df6 from this chassis (sb_readonly=0)
Jan 26 15:28:15 compute-1 ovn_controller[95641]: 2026-01-26T15:28:15Z|00170|binding|INFO|Setting lport 0cda9ffa-32fa-4511-a584-31b46b813df6 down in Southbound
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.362 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:15 compute-1 ovn_controller[95641]: 2026-01-26T15:28:15Z|00171|binding|INFO|Removing iface tap0cda9ffa-32 ovn-installed in OVS
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.364 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:15.389 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:ee:5d 10.100.0.12'], port_security=['fa:16:3e:e5:ee:5d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3e0272b2-d627-4653-a221-12286e3af322'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1d1f8313-3681-4a88-9aef-3c69f49aaa19', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-567e8645-0094-48f0-9603-67223f9e4c7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab5cf25b2abc42399ccb7131f5e1e913', 'neutron:revision_number': '10', 'neutron:security_group_ids': '42fdb3ac-2f55-4285-975e-aaa7d141bc66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c086ead-9989-49f1-93e0-00527766eebe, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=0cda9ffa-32fa-4511-a584-31b46b813df6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:28:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:15.391 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 0cda9ffa-32fa-4511-a584-31b46b813df6 in datapath 567e8645-0094-48f0-9603-67223f9e4c7a unbound from our chassis
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.392 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:15.393 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 567e8645-0094-48f0-9603-67223f9e4c7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:28:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:15.396 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0a67a148-5897-4027-927d-ff69acd8b7a8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:28:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:15.396 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a namespace which is not needed anymore
Jan 26 15:28:15 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 26 15:28:15 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Consumed 15.184s CPU time.
Jan 26 15:28:15 compute-1 systemd-machined[154697]: Machine qemu-15-instance-00000015 terminated.
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.602 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:15 compute-1 neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a[211344]: [NOTICE]   (211348) : haproxy version is 3.0.5-8e879a5
Jan 26 15:28:15 compute-1 neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a[211344]: [NOTICE]   (211348) : path to executable is /usr/sbin/haproxy
Jan 26 15:28:15 compute-1 neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a[211344]: [WARNING]  (211348) : Exiting Master process...
Jan 26 15:28:15 compute-1 podman[211562]: 2026-01-26 15:28:15.610949374 +0000 UTC m=+0.099207282 container kill 0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Jan 26 15:28:15 compute-1 neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a[211344]: [ALERT]    (211348) : Current worker (211350) exited with code 143 (Terminated)
Jan 26 15:28:15 compute-1 neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a[211344]: [WARNING]  (211348) : All workers exited. Exiting... (0)
Jan 26 15:28:15 compute-1 systemd[1]: libpod-0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec.scope: Deactivated successfully.
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.654 183407 DEBUG nova.virt.libvirt.guest [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.655 183407 INFO nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Migration operation has completed
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.655 183407 INFO nova.compute.manager [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] _post_live_migration() is started..
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.659 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.660 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.660 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.685 183407 WARNING neutronclient.v2_0.client [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.686 183407 WARNING neutronclient.v2_0.client [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:28:15 compute-1 nova_compute[183403]: 2026-01-26 15:28:15.963 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:16 compute-1 podman[211590]: 2026-01-26 15:28:16.01001791 +0000 UTC m=+0.374414707 container died 0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260120)
Jan 26 15:28:16 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec-userdata-shm.mount: Deactivated successfully.
Jan 26 15:28:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-cce6bc5f7f94890e68982a3647763c8fc0e8213c3a852cd0f725f989e8997169-merged.mount: Deactivated successfully.
Jan 26 15:28:16 compute-1 podman[211590]: 2026-01-26 15:28:16.067475389 +0000 UTC m=+0.431872136 container cleanup 0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 15:28:16 compute-1 systemd[1]: libpod-conmon-0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec.scope: Deactivated successfully.
Jan 26 15:28:16 compute-1 podman[211607]: 2026-01-26 15:28:16.090969696 +0000 UTC m=+0.102334097 container remove 0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.118 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[647688c2-d073-4667-b58e-87e07c8de558]: (4, ("Mon Jan 26 03:28:15 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a (0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec)\n0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec\nMon Jan 26 03:28:15 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a (0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec)\n0979b1573bc7ec96131bcd5320d71176ec08239accc4e0539fed8499a8e648ec\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.121 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[20b23082-2e97-4e91-86bf-938fb56bc7a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.122 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/567e8645-0094-48f0-9603-67223f9e4c7a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.122 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4100c2a1-ca8b-483c-bbb9-ecba247fb0e1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.123 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap567e8645-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.125 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:16 compute-1 kernel: tap567e8645-00: left promiscuous mode
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.154 183407 DEBUG nova.compute.manager [req-e5c39dd1-a1a8-4213-aa64-13627f15a273 req-e4bd0698-d35a-4fda-b7fb-386e4119b7a8 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-unplugged-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.154 183407 DEBUG oslo_concurrency.lockutils [req-e5c39dd1-a1a8-4213-aa64-13627f15a273 req-e4bd0698-d35a-4fda-b7fb-386e4119b7a8 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.154 183407 DEBUG oslo_concurrency.lockutils [req-e5c39dd1-a1a8-4213-aa64-13627f15a273 req-e4bd0698-d35a-4fda-b7fb-386e4119b7a8 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.154 183407 DEBUG oslo_concurrency.lockutils [req-e5c39dd1-a1a8-4213-aa64-13627f15a273 req-e4bd0698-d35a-4fda-b7fb-386e4119b7a8 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.155 183407 DEBUG nova.compute.manager [req-e5c39dd1-a1a8-4213-aa64-13627f15a273 req-e4bd0698-d35a-4fda-b7fb-386e4119b7a8 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] No waiting events found dispatching network-vif-unplugged-0cda9ffa-32fa-4511-a584-31b46b813df6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.155 183407 DEBUG nova.compute.manager [req-e5c39dd1-a1a8-4213-aa64-13627f15a273 req-e4bd0698-d35a-4fda-b7fb-386e4119b7a8 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-unplugged-0cda9ffa-32fa-4511-a584-31b46b813df6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.155 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.157 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d4448821-72c3-4041-a89b-8fd13f4539d5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.175 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f7c1aa63-8e33-43b8-be8c-c15c234a75d9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.177 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f54f9dd0-345c-46fb-b3a9-a0956fa95e0a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.200 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3a10d9-e0e7-4761-bf44-02f13f3a419a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496580, 'reachable_time': 44108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211629, 'error': None, 'target': 'ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:28:16 compute-1 systemd[1]: run-netns-ovnmeta\x2d567e8645\x2d0094\x2d48f0\x2d9603\x2d67223f9e4c7a.mount: Deactivated successfully.
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.203 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-567e8645-0094-48f0-9603-67223f9e4c7a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:28:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:16.204 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[20565605-d618-489f-941d-22a68ca51520]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.338 183407 DEBUG nova.network.neutron [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Activated binding for port 0cda9ffa-32fa-4511-a584-31b46b813df6 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.339 183407 DEBUG nova.compute.manager [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.340 183407 DEBUG nova.virt.libvirt.vif [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:27:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1195854132',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1195854132',id=21,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:27:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab5cf25b2abc42399ccb7131f5e1e913',ramdisk_id='',reservation_id='r-vqqlw3za',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1308243436-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:27:52Z,user_data=None,user_id='ce151e26874c4c369f13ecc08f41d47f',uuid=1d1f8313-3681-4a88-9aef-3c69f49aaa19,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.341 183407 DEBUG nova.network.os_vif_util [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "0cda9ffa-32fa-4511-a584-31b46b813df6", "address": "fa:16:3e:e5:ee:5d", "network": {"id": "567e8645-0094-48f0-9603-67223f9e4c7a", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1901535032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4997f838a71499eb0b82dabfe381bfe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cda9ffa-32", "ovs_interfaceid": "0cda9ffa-32fa-4511-a584-31b46b813df6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.342 183407 DEBUG nova.network.os_vif_util [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ee:5d,bridge_name='br-int',has_traffic_filtering=True,id=0cda9ffa-32fa-4511-a584-31b46b813df6,network=Network(567e8645-0094-48f0-9603-67223f9e4c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cda9ffa-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.342 183407 DEBUG os_vif [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ee:5d,bridge_name='br-int',has_traffic_filtering=True,id=0cda9ffa-32fa-4511-a584-31b46b813df6,network=Network(567e8645-0094-48f0-9603-67223f9e4c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cda9ffa-32') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.345 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.346 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0cda9ffa-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.348 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.350 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.351 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.352 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0b66a896-4604-4e26-8ac0-2fadb72b4122) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.353 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.354 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.361 183407 INFO os_vif [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ee:5d,bridge_name='br-int',has_traffic_filtering=True,id=0cda9ffa-32fa-4511-a584-31b46b813df6,network=Network(567e8645-0094-48f0-9603-67223f9e4c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cda9ffa-32')
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.362 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.362 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.363 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.363 183407 DEBUG nova.compute.manager [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.364 183407 INFO nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Deleting instance files /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19_del
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.365 183407 INFO nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Deletion of /var/lib/nova/instances/1d1f8313-3681-4a88-9aef-3c69f49aaa19_del complete
Jan 26 15:28:16 compute-1 nova_compute[183403]: 2026-01-26 15:28:16.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.217 183407 DEBUG nova.compute.manager [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.217 183407 DEBUG oslo_concurrency.lockutils [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.218 183407 DEBUG oslo_concurrency.lockutils [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.218 183407 DEBUG oslo_concurrency.lockutils [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.218 183407 DEBUG nova.compute.manager [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] No waiting events found dispatching network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.219 183407 WARNING nova.compute.manager [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received unexpected event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 for instance with vm_state active and task_state migrating.
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.219 183407 DEBUG nova.compute.manager [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-unplugged-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.219 183407 DEBUG oslo_concurrency.lockutils [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.219 183407 DEBUG oslo_concurrency.lockutils [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.220 183407 DEBUG oslo_concurrency.lockutils [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.220 183407 DEBUG nova.compute.manager [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] No waiting events found dispatching network-vif-unplugged-0cda9ffa-32fa-4511-a584-31b46b813df6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.220 183407 DEBUG nova.compute.manager [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-unplugged-0cda9ffa-32fa-4511-a584-31b46b813df6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.220 183407 DEBUG nova.compute.manager [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.221 183407 DEBUG oslo_concurrency.lockutils [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.221 183407 DEBUG oslo_concurrency.lockutils [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.221 183407 DEBUG oslo_concurrency.lockutils [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.221 183407 DEBUG nova.compute.manager [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] No waiting events found dispatching network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:28:18 compute-1 nova_compute[183403]: 2026-01-26 15:28:18.222 183407 WARNING nova.compute.manager [req-ca79d46c-5315-472e-9f60-195e65d29485 req-4b1b4319-76e4-46f9-9654-4e25260097ba 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Received unexpected event network-vif-plugged-0cda9ffa-32fa-4511-a584-31b46b813df6 for instance with vm_state active and task_state migrating.
Jan 26 15:28:18 compute-1 podman[211632]: 2026-01-26 15:28:18.92152977 +0000 UTC m=+0.081511672 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, 
io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350)
Jan 26 15:28:18 compute-1 podman[211631]: 2026-01-26 15:28:18.921661634 +0000 UTC m=+0.095044599 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:28:19 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:19.381 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:28:19 compute-1 openstack_network_exporter[195610]: ERROR   15:28:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:28:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:28:19 compute-1 openstack_network_exporter[195610]: ERROR   15:28:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:28:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:28:20 compute-1 nova_compute[183403]: 2026-01-26 15:28:20.964 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:21 compute-1 nova_compute[183403]: 2026-01-26 15:28:21.353 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:21 compute-1 nova_compute[183403]: 2026-01-26 15:28:21.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:28:21 compute-1 nova_compute[183403]: 2026-01-26 15:28:21.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.094 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.334 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.336 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.366 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.369 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5812MB free_disk=73.14459991455078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.369 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:22 compute-1 nova_compute[183403]: 2026-01-26 15:28:22.370 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:23 compute-1 nova_compute[183403]: 2026-01-26 15:28:23.397 183407 INFO nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Updating resource usage from migration 99e0ea5b-2d8c-46b9-ac63-0aa28bc2aae8
Jan 26 15:28:23 compute-1 nova_compute[183403]: 2026-01-26 15:28:23.430 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Migration 99e0ea5b-2d8c-46b9-ac63-0aa28bc2aae8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 15:28:23 compute-1 nova_compute[183403]: 2026-01-26 15:28:23.430 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:28:23 compute-1 nova_compute[183403]: 2026-01-26 15:28:23.430 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:28:22 up  1:23,  0 user,  load average: 0.42, 0.29, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_ab5cf25b2abc42399ccb7131f5e1e913': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:28:23 compute-1 nova_compute[183403]: 2026-01-26 15:28:23.495 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:28:24 compute-1 nova_compute[183403]: 2026-01-26 15:28:24.004 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:28:24 compute-1 nova_compute[183403]: 2026-01-26 15:28:24.515 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:28:24 compute-1 nova_compute[183403]: 2026-01-26 15:28:24.516 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.146s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:25 compute-1 nova_compute[183403]: 2026-01-26 15:28:25.516 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:28:25 compute-1 nova_compute[183403]: 2026-01-26 15:28:25.517 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:28:25 compute-1 nova_compute[183403]: 2026-01-26 15:28:25.517 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:28:25 compute-1 nova_compute[183403]: 2026-01-26 15:28:25.517 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:28:25 compute-1 nova_compute[183403]: 2026-01-26 15:28:25.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:28:25 compute-1 nova_compute[183403]: 2026-01-26 15:28:25.966 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:26 compute-1 nova_compute[183403]: 2026-01-26 15:28:26.355 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:29.076 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:29.077 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:28:29.077 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:29 compute-1 nova_compute[183403]: 2026-01-26 15:28:29.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:28:30 compute-1 nova_compute[183403]: 2026-01-26 15:28:29.999 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:30 compute-1 nova_compute[183403]: 2026-01-26 15:28:30.000 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:30 compute-1 nova_compute[183403]: 2026-01-26 15:28:30.001 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "1d1f8313-3681-4a88-9aef-3c69f49aaa19-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:30 compute-1 podman[211679]: 2026-01-26 15:28:30.018622258 +0000 UTC m=+0.188512845 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120)
Jan 26 15:28:30 compute-1 podman[211678]: 2026-01-26 15:28:30.062470397 +0000 UTC m=+0.232743054 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 15:28:30 compute-1 nova_compute[183403]: 2026-01-26 15:28:30.968 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.357 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.425 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.425 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.426 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.426 183407 DEBUG nova.compute.resource_tracker [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.618 183407 WARNING nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.620 183407 DEBUG oslo_concurrency.processutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.662 183407 DEBUG oslo_concurrency.processutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.663 183407 DEBUG nova.compute.resource_tracker [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5825MB free_disk=73.14459991455078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.663 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:28:31 compute-1 nova_compute[183403]: 2026-01-26 15:28:31.664 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:28:33 compute-1 nova_compute[183403]: 2026-01-26 15:28:33.405 183407 DEBUG nova.compute.resource_tracker [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Migration for instance 1d1f8313-3681-4a88-9aef-3c69f49aaa19 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 15:28:34 compute-1 nova_compute[183403]: 2026-01-26 15:28:34.566 183407 DEBUG nova.compute.resource_tracker [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 15:28:34 compute-1 nova_compute[183403]: 2026-01-26 15:28:34.603 183407 DEBUG nova.compute.resource_tracker [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Migration 99e0ea5b-2d8c-46b9-ac63-0aa28bc2aae8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 15:28:34 compute-1 nova_compute[183403]: 2026-01-26 15:28:34.604 183407 DEBUG nova.compute.resource_tracker [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:28:34 compute-1 nova_compute[183403]: 2026-01-26 15:28:34.604 183407 DEBUG nova.compute.resource_tracker [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:28:31 up  1:23,  0 user,  load average: 0.43, 0.30, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:28:34 compute-1 nova_compute[183403]: 2026-01-26 15:28:34.642 183407 DEBUG nova.compute.provider_tree [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:28:35 compute-1 nova_compute[183403]: 2026-01-26 15:28:35.195 183407 DEBUG nova.scheduler.client.report [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:28:35 compute-1 podman[192725]: time="2026-01-26T15:28:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:28:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:28:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:28:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:28:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Jan 26 15:28:35 compute-1 nova_compute[183403]: 2026-01-26 15:28:35.706 183407 DEBUG nova.compute.resource_tracker [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:28:35 compute-1 nova_compute[183403]: 2026-01-26 15:28:35.707 183407 DEBUG oslo_concurrency.lockutils [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.043s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:28:35 compute-1 nova_compute[183403]: 2026-01-26 15:28:35.733 183407 INFO nova.compute.manager [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Jan 26 15:28:35 compute-1 nova_compute[183403]: 2026-01-26 15:28:35.971 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:36 compute-1 nova_compute[183403]: 2026-01-26 15:28:36.359 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:36 compute-1 nova_compute[183403]: 2026-01-26 15:28:36.816 183407 INFO nova.scheduler.client.report [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Deleted allocation for migration 99e0ea5b-2d8c-46b9-ac63-0aa28bc2aae8
Jan 26 15:28:36 compute-1 nova_compute[183403]: 2026-01-26 15:28:36.817 183407 DEBUG nova.virt.libvirt.driver [None req-cd3e2641-dacf-4de1-964d-57d7083250f2 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 1d1f8313-3681-4a88-9aef-3c69f49aaa19] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 15:28:40 compute-1 nova_compute[183403]: 2026-01-26 15:28:40.973 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:41 compute-1 nova_compute[183403]: 2026-01-26 15:28:41.361 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:45 compute-1 nova_compute[183403]: 2026-01-26 15:28:45.975 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:46 compute-1 nova_compute[183403]: 2026-01-26 15:28:46.363 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:49 compute-1 nova_compute[183403]: 2026-01-26 15:28:49.087 183407 DEBUG nova.compute.manager [None req-7f188a60-f24e-4986-9fd0-ef1d24089aea c36e2929624c484886f7858d405633e8 179f3c996d8f4e7ea1b0aca3ec76f02e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Jan 26 15:28:49 compute-1 nova_compute[183403]: 2026-01-26 15:28:49.139 183407 DEBUG nova.compute.provider_tree [None req-7f188a60-f24e-4986-9fd0-ef1d24089aea c36e2929624c484886f7858d405633e8 179f3c996d8f4e7ea1b0aca3ec76f02e - - default default] Updating resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 generation from 31 to 34 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 15:28:49 compute-1 openstack_network_exporter[195610]: ERROR   15:28:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:28:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:28:49 compute-1 openstack_network_exporter[195610]: ERROR   15:28:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:28:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:28:49 compute-1 podman[211723]: 2026-01-26 15:28:49.909464281 +0000 UTC m=+0.082306503 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:28:49 compute-1 podman[211724]: 2026-01-26 15:28:49.918276381 +0000 UTC m=+0.088858712 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350)
Jan 26 15:28:50 compute-1 nova_compute[183403]: 2026-01-26 15:28:50.977 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:51 compute-1 nova_compute[183403]: 2026-01-26 15:28:51.401 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:53 compute-1 nova_compute[183403]: 2026-01-26 15:28:53.266 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:55 compute-1 nova_compute[183403]: 2026-01-26 15:28:55.978 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:28:56 compute-1 nova_compute[183403]: 2026-01-26 15:28:56.404 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:00 compute-1 podman[211769]: 2026-01-26 15:29:00.922993563 +0000 UTC m=+0.086526178 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Jan 26 15:29:00 compute-1 podman[211768]: 2026-01-26 15:29:00.98037929 +0000 UTC m=+0.149247570 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Jan 26 15:29:00 compute-1 nova_compute[183403]: 2026-01-26 15:29:00.980 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:01 compute-1 nova_compute[183403]: 2026-01-26 15:29:01.405 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:04 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:04.074 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:8f:ee 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7bb9409-21ac-404c-881a-401a33317e0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce5a22c9e1b44c8688bb5ce1d0d3ef81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fe2c46a-3344-4b49-9cc4-4db510e2e673, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fe81a3ca-59e0-4072-a668-b5a59e5f3940) old=Port_Binding(mac=['fa:16:3e:1a:8f:ee'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7bb9409-21ac-404c-881a-401a33317e0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce5a22c9e1b44c8688bb5ce1d0d3ef81', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:29:04 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:04.075 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fe81a3ca-59e0-4072-a668-b5a59e5f3940 in datapath d7bb9409-21ac-404c-881a-401a33317e0b updated
Jan 26 15:29:04 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:04.076 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7bb9409-21ac-404c-881a-401a33317e0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:29:04 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:04.078 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[fd70256f-9f8c-4202-9ca1-d13191a33922]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:29:05 compute-1 podman[192725]: time="2026-01-26T15:29:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:29:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:29:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:29:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:29:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Jan 26 15:29:06 compute-1 nova_compute[183403]: 2026-01-26 15:29:06.016 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:06 compute-1 nova_compute[183403]: 2026-01-26 15:29:06.407 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:11 compute-1 nova_compute[183403]: 2026-01-26 15:29:11.064 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:11 compute-1 nova_compute[183403]: 2026-01-26 15:29:11.410 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:13 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:13.260 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:29:13 compute-1 nova_compute[183403]: 2026-01-26 15:29:13.261 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:13 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:13.261 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:29:15 compute-1 nova_compute[183403]: 2026-01-26 15:29:15.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:29:16 compute-1 nova_compute[183403]: 2026-01-26 15:29:16.066 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:16.358 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:93:c5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-de14de5d-43af-4bb1-a590-a6ae86d58e77', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de14de5d-43af-4bb1-a590-a6ae86d58e77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c23d857cca949afb2559c9276298f2f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d426074b-f465-40f8-ade5-42fcf3362f84, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ebb97a27-72f6-4858-a3e1-4a08dd56d920) old=Port_Binding(mac=['fa:16:3e:b7:93:c5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-de14de5d-43af-4bb1-a590-a6ae86d58e77', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de14de5d-43af-4bb1-a590-a6ae86d58e77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c23d857cca949afb2559c9276298f2f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:29:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:16.359 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ebb97a27-72f6-4858-a3e1-4a08dd56d920 in datapath de14de5d-43af-4bb1-a590-a6ae86d58e77 updated
Jan 26 15:29:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:16.360 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network de14de5d-43af-4bb1-a590-a6ae86d58e77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:29:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:16.361 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d56c9b96-be02-44a7-b1f5-8ded6848992e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:29:16 compute-1 nova_compute[183403]: 2026-01-26 15:29:16.412 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:16 compute-1 nova_compute[183403]: 2026-01-26 15:29:16.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:29:19 compute-1 openstack_network_exporter[195610]: ERROR   15:29:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:29:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:29:19 compute-1 openstack_network_exporter[195610]: ERROR   15:29:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:29:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:29:20 compute-1 podman[211815]: 2026-01-26 15:29:20.892385105 +0000 UTC m=+0.067713074 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:29:20 compute-1 podman[211816]: 2026-01-26 15:29:20.912873628 +0000 UTC m=+0.084064115 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-type=git, release=1755695350, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc.)
Jan 26 15:29:21 compute-1 nova_compute[183403]: 2026-01-26 15:29:21.068 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:21.263 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:29:21 compute-1 nova_compute[183403]: 2026-01-26 15:29:21.414 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:23 compute-1 nova_compute[183403]: 2026-01-26 15:29:23.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:29:23 compute-1 nova_compute[183403]: 2026-01-26 15:29:23.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.093 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.295 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.297 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.337 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.338 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5853MB free_disk=73.14490509033203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.338 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:29:24 compute-1 nova_compute[183403]: 2026-01-26 15:29:24.339 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:29:25 compute-1 nova_compute[183403]: 2026-01-26 15:29:25.392 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:29:25 compute-1 nova_compute[183403]: 2026-01-26 15:29:25.393 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:29:24 up  1:24,  0 user,  load average: 0.17, 0.25, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:29:25 compute-1 nova_compute[183403]: 2026-01-26 15:29:25.419 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:29:25 compute-1 nova_compute[183403]: 2026-01-26 15:29:25.927 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:29:26 compute-1 nova_compute[183403]: 2026-01-26 15:29:26.131 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:26 compute-1 ovn_controller[95641]: 2026-01-26T15:29:26Z|00172|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 15:29:26 compute-1 nova_compute[183403]: 2026-01-26 15:29:26.417 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:26 compute-1 nova_compute[183403]: 2026-01-26 15:29:26.439 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:29:26 compute-1 nova_compute[183403]: 2026-01-26 15:29:26.439 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:29:27 compute-1 nova_compute[183403]: 2026-01-26 15:29:27.438 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:29:27 compute-1 nova_compute[183403]: 2026-01-26 15:29:27.439 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:29:27 compute-1 nova_compute[183403]: 2026-01-26 15:29:27.440 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:29:27 compute-1 nova_compute[183403]: 2026-01-26 15:29:27.440 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:29:27 compute-1 nova_compute[183403]: 2026-01-26 15:29:27.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:29:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:29.078 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:29:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:29.078 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:29:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:29:29.079 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:29:31 compute-1 nova_compute[183403]: 2026-01-26 15:29:31.134 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:31 compute-1 nova_compute[183403]: 2026-01-26 15:29:31.419 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:31 compute-1 podman[211862]: 2026-01-26 15:29:31.915784594 +0000 UTC m=+0.084497078 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:29:31 compute-1 podman[211861]: 2026-01-26 15:29:31.953271189 +0000 UTC m=+0.124354766 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:29:35 compute-1 podman[192725]: time="2026-01-26T15:29:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:29:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:29:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:29:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:29:36 compute-1 nova_compute[183403]: 2026-01-26 15:29:36.135 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:36 compute-1 nova_compute[183403]: 2026-01-26 15:29:36.421 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:41 compute-1 nova_compute[183403]: 2026-01-26 15:29:41.137 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:41 compute-1 nova_compute[183403]: 2026-01-26 15:29:41.423 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:46 compute-1 nova_compute[183403]: 2026-01-26 15:29:46.140 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:46 compute-1 nova_compute[183403]: 2026-01-26 15:29:46.425 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:49 compute-1 openstack_network_exporter[195610]: ERROR   15:29:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:29:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:29:49 compute-1 openstack_network_exporter[195610]: ERROR   15:29:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:29:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:29:51 compute-1 nova_compute[183403]: 2026-01-26 15:29:51.141 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:51 compute-1 nova_compute[183403]: 2026-01-26 15:29:51.427 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:51 compute-1 podman[211908]: 2026-01-26 15:29:51.920601903 +0000 UTC m=+0.074762924 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Jan 26 15:29:51 compute-1 podman[211907]: 2026-01-26 15:29:51.920876431 +0000 UTC m=+0.079376759 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:29:56 compute-1 nova_compute[183403]: 2026-01-26 15:29:56.145 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:29:56 compute-1 nova_compute[183403]: 2026-01-26 15:29:56.430 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:01 compute-1 nova_compute[183403]: 2026-01-26 15:30:01.147 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:01 compute-1 nova_compute[183403]: 2026-01-26 15:30:01.433 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:02 compute-1 podman[211949]: 2026-01-26 15:30:02.884425931 +0000 UTC m=+0.063832678 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 15:30:02 compute-1 podman[211948]: 2026-01-26 15:30:02.978496617 +0000 UTC m=+0.162430817 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:30:05 compute-1 podman[192725]: time="2026-01-26T15:30:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:30:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:30:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:30:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:30:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:30:06 compute-1 nova_compute[183403]: 2026-01-26 15:30:06.150 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:06 compute-1 nova_compute[183403]: 2026-01-26 15:30:06.435 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:11 compute-1 nova_compute[183403]: 2026-01-26 15:30:11.151 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:11 compute-1 nova_compute[183403]: 2026-01-26 15:30:11.437 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:16 compute-1 nova_compute[183403]: 2026-01-26 15:30:16.154 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:16 compute-1 nova_compute[183403]: 2026-01-26 15:30:16.440 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:16 compute-1 nova_compute[183403]: 2026-01-26 15:30:16.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:30:17 compute-1 nova_compute[183403]: 2026-01-26 15:30:17.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:30:19 compute-1 openstack_network_exporter[195610]: ERROR   15:30:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:30:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:30:19 compute-1 openstack_network_exporter[195610]: ERROR   15:30:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:30:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:30:21 compute-1 nova_compute[183403]: 2026-01-26 15:30:21.155 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:21 compute-1 nova_compute[183403]: 2026-01-26 15:30:21.441 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:22 compute-1 podman[211993]: 2026-01-26 15:30:22.885443179 +0000 UTC m=+0.060086317 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:30:22 compute-1 podman[211994]: 2026-01-26 15:30:22.887647199 +0000 UTC m=+0.063622974 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 
'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 26 15:30:23 compute-1 nova_compute[183403]: 2026-01-26 15:30:23.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:30:23 compute-1 nova_compute[183403]: 2026-01-26 15:30:23.821 183407 DEBUG nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Creating tmpfile /var/lib/nova/instances/tmp1q80k76o to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:30:23 compute-1 nova_compute[183403]: 2026-01-26 15:30:23.822 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:23 compute-1 nova_compute[183403]: 2026-01-26 15:30:23.907 183407 DEBUG nova.compute.manager [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1q80k76o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.092 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.316 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.321 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.347 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.348 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5853MB free_disk=73.14501190185547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.348 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:30:24 compute-1 nova_compute[183403]: 2026-01-26 15:30:24.349 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:30:25 compute-1 nova_compute[183403]: 2026-01-26 15:30:25.897 183407 WARNING nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 2239680b-85e2-414c-a1a8-69e506b41f64 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 26 15:30:25 compute-1 nova_compute[183403]: 2026-01-26 15:30:25.897 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:30:25 compute-1 nova_compute[183403]: 2026-01-26 15:30:25.898 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:30:24 up  1:25,  0 user,  load average: 0.06, 0.20, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:30:25 compute-1 nova_compute[183403]: 2026-01-26 15:30:25.938 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:30:25 compute-1 nova_compute[183403]: 2026-01-26 15:30:25.945 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:26 compute-1 nova_compute[183403]: 2026-01-26 15:30:26.157 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:26 compute-1 nova_compute[183403]: 2026-01-26 15:30:26.443 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:26 compute-1 nova_compute[183403]: 2026-01-26 15:30:26.448 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:30:26 compute-1 nova_compute[183403]: 2026-01-26 15:30:26.967 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:30:26 compute-1 nova_compute[183403]: 2026-01-26 15:30:26.967 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.619s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:30:27 compute-1 nova_compute[183403]: 2026-01-26 15:30:27.968 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:30:27 compute-1 nova_compute[183403]: 2026-01-26 15:30:27.969 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:30:27 compute-1 nova_compute[183403]: 2026-01-26 15:30:27.969 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:30:27 compute-1 nova_compute[183403]: 2026-01-26 15:30:27.970 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:30:27 compute-1 nova_compute[183403]: 2026-01-26 15:30:27.970 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:30:28 compute-1 nova_compute[183403]: 2026-01-26 15:30:28.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:30:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:29.080 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:30:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:29.081 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:30:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:29.082 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:30:29 compute-1 nova_compute[183403]: 2026-01-26 15:30:29.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:30:29 compute-1 nova_compute[183403]: 2026-01-26 15:30:29.987 183407 DEBUG nova.compute.manager [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1q80k76o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2239680b-85e2-414c-a1a8-69e506b41f64',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:30:31 compute-1 nova_compute[183403]: 2026-01-26 15:30:31.007 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-2239680b-85e2-414c-a1a8-69e506b41f64" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:30:31 compute-1 nova_compute[183403]: 2026-01-26 15:30:31.008 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-2239680b-85e2-414c-a1a8-69e506b41f64" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:30:31 compute-1 nova_compute[183403]: 2026-01-26 15:30:31.008 183407 DEBUG nova.network.neutron [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:30:31 compute-1 nova_compute[183403]: 2026-01-26 15:30:31.161 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:31 compute-1 nova_compute[183403]: 2026-01-26 15:30:31.445 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:31 compute-1 nova_compute[183403]: 2026-01-26 15:30:31.583 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:32 compute-1 nova_compute[183403]: 2026-01-26 15:30:32.298 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:33 compute-1 nova_compute[183403]: 2026-01-26 15:30:33.201 183407 DEBUG nova.network.neutron [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Updating instance_info_cache with network_info: [{"id": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "address": "fa:16:3e:a9:16:0a", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0889ea6c-5b", "ovs_interfaceid": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:30:33 compute-1 nova_compute[183403]: 2026-01-26 15:30:33.709 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-2239680b-85e2-414c-a1a8-69e506b41f64" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:30:33 compute-1 nova_compute[183403]: 2026-01-26 15:30:33.725 183407 DEBUG nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1q80k76o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2239680b-85e2-414c-a1a8-69e506b41f64',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:30:33 compute-1 nova_compute[183403]: 2026-01-26 15:30:33.726 183407 DEBUG nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Creating instance directory: /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:30:33 compute-1 nova_compute[183403]: 2026-01-26 15:30:33.727 183407 DEBUG nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Creating disk.info with the contents: {'/var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk': 'qcow2', '/var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:30:33 compute-1 nova_compute[183403]: 2026-01-26 15:30:33.727 183407 DEBUG nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:30:33 compute-1 nova_compute[183403]: 2026-01-26 15:30:33.728 183407 DEBUG nova.objects.instance [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2239680b-85e2-414c-a1a8-69e506b41f64 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:30:33 compute-1 podman[212041]: 2026-01-26 15:30:33.906212337 +0000 UTC m=+0.072941545 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:30:33 compute-1 podman[212040]: 2026-01-26 15:30:33.935061388 +0000 UTC m=+0.103391509 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.235 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.243 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.245 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.335 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.337 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.338 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.339 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.345 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.345 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.429 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.430 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.483 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.485 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.486 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.575 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.576 183407 DEBUG nova.virt.disk.api [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.576 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.629 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.630 183407 DEBUG nova.virt.disk.api [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:30:34 compute-1 nova_compute[183403]: 2026-01-26 15:30:34.630 183407 DEBUG nova.objects.instance [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 2239680b-85e2-414c-a1a8-69e506b41f64 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.139 183407 DEBUG nova.objects.base [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<2239680b-85e2-414c-a1a8-69e506b41f64> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.140 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.177 183407 DEBUG oslo_concurrency.processutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64/disk.config 497664" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.178 183407 DEBUG nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.181 183407 DEBUG nova.virt.libvirt.vif [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:29:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-556438860',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-556438860',id=22,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:29:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2c23d857cca949afb2559c9276298f2f',ramdisk_id='',reservation_id='r-seb1ykz9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-418427150',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-418427150-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:29:44Z,user_data=None,user_id='d0d771a34e2643d782edb3717de7f449',uuid=2239680b-85e2-414c-a1a8-69e506b41f64,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "address": "fa:16:3e:a9:16:0a", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0889ea6c-5b", "ovs_interfaceid": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.181 183407 DEBUG nova.network.os_vif_util [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "address": "fa:16:3e:a9:16:0a", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0889ea6c-5b", "ovs_interfaceid": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.183 183407 DEBUG nova.network.os_vif_util [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:16:0a,bridge_name='br-int',has_traffic_filtering=True,id=0889ea6c-5b91-46c9-aefc-a5b43a896e33,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0889ea6c-5b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.184 183407 DEBUG os_vif [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:16:0a,bridge_name='br-int',has_traffic_filtering=True,id=0889ea6c-5b91-46c9-aefc-a5b43a896e33,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0889ea6c-5b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.185 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.186 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.187 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.188 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.188 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'bf4b0d3c-6a6c-5fd3-b088-cc00cb2155eb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.190 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.192 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.209 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.210 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0889ea6c-5b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.211 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap0889ea6c-5b, col_values=(('qos', UUID('14438c0f-14d5-43ee-ad65-6749cf806539')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.211 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap0889ea6c-5b, col_values=(('external_ids', {'iface-id': '0889ea6c-5b91-46c9-aefc-a5b43a896e33', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:16:0a', 'vm-uuid': '2239680b-85e2-414c-a1a8-69e506b41f64'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:30:35 compute-1 NetworkManager[55716]: <info>  [1769441435.2158] manager: (tap0889ea6c-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.218 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.223 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.224 183407 INFO os_vif [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:16:0a,bridge_name='br-int',has_traffic_filtering=True,id=0889ea6c-5b91-46c9-aefc-a5b43a896e33,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0889ea6c-5b')
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.225 183407 DEBUG nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.225 183407 DEBUG nova.compute.manager [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1q80k76o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2239680b-85e2-414c-a1a8-69e506b41f64',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.226 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.350 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.629 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:35 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:35.635 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:30:35 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:35.636 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:30:35 compute-1 podman[192725]: time="2026-01-26T15:30:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:30:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:30:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:30:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:30:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Jan 26 15:30:35 compute-1 nova_compute[183403]: 2026-01-26 15:30:35.989 183407 DEBUG nova.network.neutron [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Port 0889ea6c-5b91-46c9-aefc-a5b43a896e33 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:30:36 compute-1 nova_compute[183403]: 2026-01-26 15:30:36.006 183407 DEBUG nova.compute.manager [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1q80k76o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2239680b-85e2-414c-a1a8-69e506b41f64',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:30:36 compute-1 nova_compute[183403]: 2026-01-26 15:30:36.200 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:36.637 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:30:39 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 15:30:39 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 15:30:39 compute-1 kernel: tap0889ea6c-5b: entered promiscuous mode
Jan 26 15:30:39 compute-1 NetworkManager[55716]: <info>  [1769441439.6028] manager: (tap0889ea6c-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Jan 26 15:30:39 compute-1 ovn_controller[95641]: 2026-01-26T15:30:39Z|00173|binding|INFO|Claiming lport 0889ea6c-5b91-46c9-aefc-a5b43a896e33 for this additional chassis.
Jan 26 15:30:39 compute-1 ovn_controller[95641]: 2026-01-26T15:30:39Z|00174|binding|INFO|0889ea6c-5b91-46c9-aefc-a5b43a896e33: Claiming fa:16:3e:a9:16:0a 10.100.0.12
Jan 26 15:30:39 compute-1 nova_compute[183403]: 2026-01-26 15:30:39.606 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:39 compute-1 nova_compute[183403]: 2026-01-26 15:30:39.608 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:39 compute-1 nova_compute[183403]: 2026-01-26 15:30:39.615 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.622 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:16:0a 10.100.0.12'], port_security=['fa:16:3e:a9:16:0a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2239680b-85e2-414c-a1a8-69e506b41f64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7bb9409-21ac-404c-881a-401a33317e0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c23d857cca949afb2559c9276298f2f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2816b4e0-6d42-4df2-a497-a52d8e0e90c8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fe2c46a-3344-4b49-9cc4-4db510e2e673, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=0889ea6c-5b91-46c9-aefc-a5b43a896e33) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.623 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 0889ea6c-5b91-46c9-aefc-a5b43a896e33 in datapath d7bb9409-21ac-404c-881a-401a33317e0b unbound from our chassis
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.625 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7bb9409-21ac-404c-881a-401a33317e0b
Jan 26 15:30:39 compute-1 systemd-udevd[212139]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.645 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9313117d-be84-416c-93d7-9a444a60359d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.645 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7bb9409-21 in ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.649 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7bb9409-20 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.649 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4da0d562-6e2a-47e0-b579-1ad9d6572d26]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.650 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8cfc36-c987-46d1-9b81-922608069037]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 systemd-machined[154697]: New machine qemu-16-instance-00000016.
Jan 26 15:30:39 compute-1 NetworkManager[55716]: <info>  [1769441439.6595] device (tap0889ea6c-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:30:39 compute-1 NetworkManager[55716]: <info>  [1769441439.6612] device (tap0889ea6c-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.669 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[9953b042-9816-4907-96ae-9c05b9cfb75a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 nova_compute[183403]: 2026-01-26 15:30:39.687 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:39 compute-1 ovn_controller[95641]: 2026-01-26T15:30:39Z|00175|binding|INFO|Setting lport 0889ea6c-5b91-46c9-aefc-a5b43a896e33 ovn-installed in OVS
Jan 26 15:30:39 compute-1 nova_compute[183403]: 2026-01-26 15:30:39.693 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:39 compute-1 systemd[1]: Started Virtual Machine qemu-16-instance-00000016.
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.697 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c588bc87-32df-41a2-924c-52746f9407fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.744 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[3d5455b9-3e26-48a4-9c0e-dbbd69eca27f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 systemd-udevd[212143]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.752 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[42a494be-4fa4-4d46-acc4-63f7850ac2c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 NetworkManager[55716]: <info>  [1769441439.7538] manager: (tapd7bb9409-20): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.796 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[c327fdb1-7e21-4099-806e-6b09d39a20c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.799 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[653c4513-251f-4d30-8083-260540d062bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 NetworkManager[55716]: <info>  [1769441439.8336] device (tapd7bb9409-20): carrier: link connected
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.842 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[fad18572-75a6-4cc3-b0fb-7d004673f24f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.867 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a2891137-e8d5-4d17-a88b-d55e9213ac63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7bb9409-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:8f:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516243, 'reachable_time': 15685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212172, 'error': None, 'target': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.888 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bcdcef32-9079-4225-ae8d-0314b2275a1b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:8fee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 516243, 'tstamp': 516243}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212173, 'error': None, 'target': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.914 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4f29b2-f1f7-4323-bac2-4818850ef36d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7bb9409-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:8f:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516243, 'reachable_time': 15685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212174, 'error': None, 'target': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:39 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:39.959 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d172ef-c3b5-4d8c-b622-0c8091ee7999]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.045 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6a2ca796-3541-4bd0-8d95-03a65eaf89e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.048 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7bb9409-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.049 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.049 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7bb9409-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:30:40 compute-1 NetworkManager[55716]: <info>  [1769441440.0523] manager: (tapd7bb9409-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 26 15:30:40 compute-1 nova_compute[183403]: 2026-01-26 15:30:40.051 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:40 compute-1 kernel: tapd7bb9409-20: entered promiscuous mode
Jan 26 15:30:40 compute-1 nova_compute[183403]: 2026-01-26 15:30:40.054 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.056 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7bb9409-20, col_values=(('external_ids', {'iface-id': 'fe81a3ca-59e0-4072-a668-b5a59e5f3940'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:30:40 compute-1 nova_compute[183403]: 2026-01-26 15:30:40.057 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:40 compute-1 ovn_controller[95641]: 2026-01-26T15:30:40Z|00176|binding|INFO|Releasing lport fe81a3ca-59e0-4072-a668-b5a59e5f3940 from this chassis (sb_readonly=0)
Jan 26 15:30:40 compute-1 nova_compute[183403]: 2026-01-26 15:30:40.079 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:40 compute-1 nova_compute[183403]: 2026-01-26 15:30:40.082 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.084 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a9464fac-ac09-48f4-b20a-7c0f26a02775]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.085 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.085 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.086 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d7bb9409-21ac-404c-881a-401a33317e0b disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.086 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.087 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ae2a20-1a9f-49f3-aa79-714c3137a786]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.087 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.088 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0b05e95e-5981-4426-b7bb-f249aa6e8bbf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.089 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-d7bb9409-21ac-404c-881a-401a33317e0b
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID d7bb9409-21ac-404c-881a-401a33317e0b
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:30:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:30:40.090 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'env', 'PROCESS_TAG=haproxy-d7bb9409-21ac-404c-881a-401a33317e0b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7bb9409-21ac-404c-881a-401a33317e0b.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:30:40 compute-1 nova_compute[183403]: 2026-01-26 15:30:40.214 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:40 compute-1 podman[212206]: 2026-01-26 15:30:40.586687556 +0000 UTC m=+0.088686311 container create 2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 15:30:40 compute-1 systemd[1]: Started libpod-conmon-2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f.scope.
Jan 26 15:30:40 compute-1 podman[212206]: 2026-01-26 15:30:40.544237258 +0000 UTC m=+0.046236093 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:30:40 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:30:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9dbc3b285ad32abffcf14645aedb64952a25d166d64551d8599432942b0a54cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:30:40 compute-1 podman[212206]: 2026-01-26 15:30:40.691962846 +0000 UTC m=+0.193961751 container init 2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 15:30:40 compute-1 podman[212206]: 2026-01-26 15:30:40.70172589 +0000 UTC m=+0.203724655 container start 2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:30:40 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212227]: [NOTICE]   (212231) : New worker (212233) forked
Jan 26 15:30:40 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212227]: [NOTICE]   (212231) : Loading success.
Jan 26 15:30:41 compute-1 nova_compute[183403]: 2026-01-26 15:30:41.204 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:42 compute-1 ovn_controller[95641]: 2026-01-26T15:30:42Z|00177|binding|INFO|Claiming lport 0889ea6c-5b91-46c9-aefc-a5b43a896e33 for this chassis.
Jan 26 15:30:42 compute-1 ovn_controller[95641]: 2026-01-26T15:30:42Z|00178|binding|INFO|0889ea6c-5b91-46c9-aefc-a5b43a896e33: Claiming fa:16:3e:a9:16:0a 10.100.0.12
Jan 26 15:30:42 compute-1 ovn_controller[95641]: 2026-01-26T15:30:42Z|00179|binding|INFO|Setting lport 0889ea6c-5b91-46c9-aefc-a5b43a896e33 up in Southbound
Jan 26 15:30:43 compute-1 nova_compute[183403]: 2026-01-26 15:30:43.548 183407 INFO nova.compute.manager [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Post operation of migration started
Jan 26 15:30:43 compute-1 nova_compute[183403]: 2026-01-26 15:30:43.549 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:44 compute-1 nova_compute[183403]: 2026-01-26 15:30:44.112 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:44 compute-1 nova_compute[183403]: 2026-01-26 15:30:44.113 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:44 compute-1 nova_compute[183403]: 2026-01-26 15:30:44.191 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-2239680b-85e2-414c-a1a8-69e506b41f64" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:30:44 compute-1 nova_compute[183403]: 2026-01-26 15:30:44.192 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-2239680b-85e2-414c-a1a8-69e506b41f64" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:30:44 compute-1 nova_compute[183403]: 2026-01-26 15:30:44.192 183407 DEBUG nova.network.neutron [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:30:44 compute-1 nova_compute[183403]: 2026-01-26 15:30:44.699 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:45 compute-1 nova_compute[183403]: 2026-01-26 15:30:45.216 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:45 compute-1 nova_compute[183403]: 2026-01-26 15:30:45.437 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:45 compute-1 nova_compute[183403]: 2026-01-26 15:30:45.603 183407 DEBUG nova.network.neutron [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Updating instance_info_cache with network_info: [{"id": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "address": "fa:16:3e:a9:16:0a", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0889ea6c-5b", "ovs_interfaceid": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:30:46 compute-1 nova_compute[183403]: 2026-01-26 15:30:46.113 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-2239680b-85e2-414c-a1a8-69e506b41f64" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:30:46 compute-1 nova_compute[183403]: 2026-01-26 15:30:46.206 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:46 compute-1 nova_compute[183403]: 2026-01-26 15:30:46.642 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:30:46 compute-1 nova_compute[183403]: 2026-01-26 15:30:46.643 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:30:46 compute-1 nova_compute[183403]: 2026-01-26 15:30:46.644 183407 DEBUG oslo_concurrency.lockutils [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:30:46 compute-1 nova_compute[183403]: 2026-01-26 15:30:46.649 183407 INFO nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:30:46 compute-1 virtqemud[183290]: Domain id=16 name='instance-00000016' uuid=2239680b-85e2-414c-a1a8-69e506b41f64 is tainted: custom-monitor
Jan 26 15:30:47 compute-1 nova_compute[183403]: 2026-01-26 15:30:47.658 183407 INFO nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:30:48 compute-1 nova_compute[183403]: 2026-01-26 15:30:48.664 183407 INFO nova.virt.libvirt.driver [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:30:48 compute-1 nova_compute[183403]: 2026-01-26 15:30:48.669 183407 DEBUG nova.compute.manager [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:30:49 compute-1 nova_compute[183403]: 2026-01-26 15:30:49.201 183407 DEBUG nova.objects.instance [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:30:49 compute-1 openstack_network_exporter[195610]: ERROR   15:30:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:30:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:30:49 compute-1 openstack_network_exporter[195610]: ERROR   15:30:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:30:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:30:50 compute-1 nova_compute[183403]: 2026-01-26 15:30:50.218 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:50 compute-1 nova_compute[183403]: 2026-01-26 15:30:50.225 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:51 compute-1 nova_compute[183403]: 2026-01-26 15:30:51.120 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:51 compute-1 nova_compute[183403]: 2026-01-26 15:30:51.120 183407 WARNING neutronclient.v2_0.client [None req-0f23ae9a-59ae-4168-8841-fbb9cf304b9a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:30:51 compute-1 nova_compute[183403]: 2026-01-26 15:30:51.249 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:53 compute-1 podman[212249]: 2026-01-26 15:30:53.888809784 +0000 UTC m=+0.070583741 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:30:53 compute-1 podman[212250]: 2026-01-26 15:30:53.912209888 +0000 UTC m=+0.080944782 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, release=1755695350)
Jan 26 15:30:55 compute-1 nova_compute[183403]: 2026-01-26 15:30:55.223 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:30:56 compute-1 nova_compute[183403]: 2026-01-26 15:30:56.281 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:00 compute-1 nova_compute[183403]: 2026-01-26 15:31:00.226 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:01 compute-1 nova_compute[183403]: 2026-01-26 15:31:01.284 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:04 compute-1 podman[212296]: 2026-01-26 15:31:04.900398966 +0000 UTC m=+0.070439508 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 15:31:04 compute-1 podman[212295]: 2026-01-26 15:31:04.99775369 +0000 UTC m=+0.168605143 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.122 183407 DEBUG oslo_concurrency.lockutils [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Acquiring lock "2239680b-85e2-414c-a1a8-69e506b41f64" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.123 183407 DEBUG oslo_concurrency.lockutils [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "2239680b-85e2-414c-a1a8-69e506b41f64" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.124 183407 DEBUG oslo_concurrency.lockutils [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Acquiring lock "2239680b-85e2-414c-a1a8-69e506b41f64-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.124 183407 DEBUG oslo_concurrency.lockutils [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "2239680b-85e2-414c-a1a8-69e506b41f64-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.125 183407 DEBUG oslo_concurrency.lockutils [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "2239680b-85e2-414c-a1a8-69e506b41f64-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.152 183407 INFO nova.compute.manager [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Terminating instance
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.228 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:05 compute-1 podman[192725]: time="2026-01-26T15:31:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:31:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:31:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:31:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:31:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2660 "" "Go-http-client/1.1"
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.683 183407 DEBUG nova.compute.manager [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:31:05 compute-1 kernel: tap0889ea6c-5b (unregistering): left promiscuous mode
Jan 26 15:31:05 compute-1 NetworkManager[55716]: <info>  [1769441465.7094] device (tap0889ea6c-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:31:05 compute-1 ovn_controller[95641]: 2026-01-26T15:31:05Z|00180|binding|INFO|Releasing lport 0889ea6c-5b91-46c9-aefc-a5b43a896e33 from this chassis (sb_readonly=0)
Jan 26 15:31:05 compute-1 ovn_controller[95641]: 2026-01-26T15:31:05Z|00181|binding|INFO|Setting lport 0889ea6c-5b91-46c9-aefc-a5b43a896e33 down in Southbound
Jan 26 15:31:05 compute-1 ovn_controller[95641]: 2026-01-26T15:31:05Z|00182|binding|INFO|Removing iface tap0889ea6c-5b ovn-installed in OVS
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.722 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.725 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:05 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:05.730 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:16:0a 10.100.0.12'], port_security=['fa:16:3e:a9:16:0a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2239680b-85e2-414c-a1a8-69e506b41f64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7bb9409-21ac-404c-881a-401a33317e0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c23d857cca949afb2559c9276298f2f', 'neutron:revision_number': '14', 'neutron:security_group_ids': '2816b4e0-6d42-4df2-a497-a52d8e0e90c8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fe2c46a-3344-4b49-9cc4-4db510e2e673, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=0889ea6c-5b91-46c9-aefc-a5b43a896e33) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:31:05 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:05.732 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 0889ea6c-5b91-46c9-aefc-a5b43a896e33 in datapath d7bb9409-21ac-404c-881a-401a33317e0b unbound from our chassis
Jan 26 15:31:05 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:05.734 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7bb9409-21ac-404c-881a-401a33317e0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:31:05 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:05.735 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[716a2ed4-d6cb-4a5a-b6a6-36ee4beb47e7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:31:05 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:05.735 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b namespace which is not needed anymore
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.760 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:05 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 26 15:31:05 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Consumed 2.934s CPU time.
Jan 26 15:31:05 compute-1 systemd-machined[154697]: Machine qemu-16-instance-00000016 terminated.
Jan 26 15:31:05 compute-1 podman[212364]: 2026-01-26 15:31:05.9045305 +0000 UTC m=+0.046959012 container kill 2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:31:05 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212227]: [NOTICE]   (212231) : haproxy version is 3.0.5-8e879a5
Jan 26 15:31:05 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212227]: [NOTICE]   (212231) : path to executable is /usr/sbin/haproxy
Jan 26 15:31:05 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212227]: [WARNING]  (212231) : Exiting Master process...
Jan 26 15:31:05 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212227]: [ALERT]    (212231) : Current worker (212233) exited with code 143 (Terminated)
Jan 26 15:31:05 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212227]: [WARNING]  (212231) : All workers exited. Exiting... (0)
Jan 26 15:31:05 compute-1 systemd[1]: libpod-2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f.scope: Deactivated successfully.
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.910 183407 DEBUG nova.compute.manager [req-e4151478-1663-4797-bf97-8572c3a1e76c req-4072b718-4588-4cf0-b850-b90b3ff91d22 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Received event network-vif-unplugged-0889ea6c-5b91-46c9-aefc-a5b43a896e33 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.910 183407 DEBUG oslo_concurrency.lockutils [req-e4151478-1663-4797-bf97-8572c3a1e76c req-4072b718-4588-4cf0-b850-b90b3ff91d22 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "2239680b-85e2-414c-a1a8-69e506b41f64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.910 183407 DEBUG oslo_concurrency.lockutils [req-e4151478-1663-4797-bf97-8572c3a1e76c req-4072b718-4588-4cf0-b850-b90b3ff91d22 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "2239680b-85e2-414c-a1a8-69e506b41f64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.910 183407 DEBUG oslo_concurrency.lockutils [req-e4151478-1663-4797-bf97-8572c3a1e76c req-4072b718-4588-4cf0-b850-b90b3ff91d22 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "2239680b-85e2-414c-a1a8-69e506b41f64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.910 183407 DEBUG nova.compute.manager [req-e4151478-1663-4797-bf97-8572c3a1e76c req-4072b718-4588-4cf0-b850-b90b3ff91d22 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] No waiting events found dispatching network-vif-unplugged-0889ea6c-5b91-46c9-aefc-a5b43a896e33 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.911 183407 DEBUG nova.compute.manager [req-e4151478-1663-4797-bf97-8572c3a1e76c req-4072b718-4588-4cf0-b850-b90b3ff91d22 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Received event network-vif-unplugged-0889ea6c-5b91-46c9-aefc-a5b43a896e33 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:31:05 compute-1 podman[212382]: 2026-01-26 15:31:05.955677864 +0000 UTC m=+0.028366519 container died 2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.973 183407 INFO nova.virt.libvirt.driver [-] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Instance destroyed successfully.
Jan 26 15:31:05 compute-1 nova_compute[183403]: 2026-01-26 15:31:05.974 183407 DEBUG nova.objects.instance [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lazy-loading 'resources' on Instance uuid 2239680b-85e2-414c-a1a8-69e506b41f64 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:31:05 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f-userdata-shm.mount: Deactivated successfully.
Jan 26 15:31:05 compute-1 systemd[1]: var-lib-containers-storage-overlay-9dbc3b285ad32abffcf14645aedb64952a25d166d64551d8599432942b0a54cf-merged.mount: Deactivated successfully.
Jan 26 15:31:06 compute-1 podman[212382]: 2026-01-26 15:31:06.009256143 +0000 UTC m=+0.081944758 container cleanup 2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 15:31:06 compute-1 systemd[1]: libpod-conmon-2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f.scope: Deactivated successfully.
Jan 26 15:31:06 compute-1 podman[212397]: 2026-01-26 15:31:06.035792832 +0000 UTC m=+0.082880704 container remove 2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.064 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[18a69d9e-0cd7-4282-b651-694c0ae66859]: (4, ("Mon Jan 26 03:31:05 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b (2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f)\n2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f\nMon Jan 26 03:31:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b (2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f)\n2fcedade8b59027a6392047f5849ba03804cd7d23e6d85298dc3840f25eb992f\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.065 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0c0552-d276-4150-bfc4-0925242954e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.066 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.067 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6730c934-f6ff-4044-af8f-29b23ae96afe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.069 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7bb9409-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.101 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:06 compute-1 kernel: tapd7bb9409-20: left promiscuous mode
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.122 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.124 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[60a15dcb-8a40-43b0-ab1e-7f9327192cb5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.142 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[811ba001-05c5-4dcc-9792-3384b3c8eec9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.143 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[00f86b11-09cc-4328-9a3a-96fc3435536b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.165 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e9fa12-a809-4973-bc38-6da0e5a730fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516233, 'reachable_time': 20870, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212428, 'error': None, 'target': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:31:06 compute-1 systemd[1]: run-netns-ovnmeta\x2dd7bb9409\x2d21ac\x2d404c\x2d881a\x2d401a33317e0b.mount: Deactivated successfully.
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.170 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:31:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:06.170 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb12467-591a-4cdf-b928-86654250dfca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.287 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.481 183407 DEBUG nova.virt.libvirt.vif [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:29:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-556438860',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-556438860',id=22,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:29:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2c23d857cca949afb2559c9276298f2f',ramdisk_id='',reservation_id='r-seb1ykz9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-418427150',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-418427150-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:30:49Z,user_data=None,user_id='d0d771a34e2643d782edb3717de7f449',uuid=2239680b-85e2-414c-a1a8-69e506b41f64,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "address": "fa:16:3e:a9:16:0a", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0889ea6c-5b", "ovs_interfaceid": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.482 183407 DEBUG nova.network.os_vif_util [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Converting VIF {"id": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "address": "fa:16:3e:a9:16:0a", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0889ea6c-5b", "ovs_interfaceid": "0889ea6c-5b91-46c9-aefc-a5b43a896e33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.483 183407 DEBUG nova.network.os_vif_util [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:16:0a,bridge_name='br-int',has_traffic_filtering=True,id=0889ea6c-5b91-46c9-aefc-a5b43a896e33,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0889ea6c-5b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.484 183407 DEBUG os_vif [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:16:0a,bridge_name='br-int',has_traffic_filtering=True,id=0889ea6c-5b91-46c9-aefc-a5b43a896e33,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0889ea6c-5b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.488 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.489 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0889ea6c-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.491 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.493 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.495 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.495 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=14438c0f-14d5-43ee-ad65-6749cf806539) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.496 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.498 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.500 183407 INFO os_vif [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:16:0a,bridge_name='br-int',has_traffic_filtering=True,id=0889ea6c-5b91-46c9-aefc-a5b43a896e33,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0889ea6c-5b')
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.501 183407 INFO nova.virt.libvirt.driver [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Deleting instance files /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64_del
Jan 26 15:31:06 compute-1 nova_compute[183403]: 2026-01-26 15:31:06.502 183407 INFO nova.virt.libvirt.driver [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Deletion of /var/lib/nova/instances/2239680b-85e2-414c-a1a8-69e506b41f64_del complete
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.025 183407 INFO nova.compute.manager [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Took 1.34 seconds to destroy the instance on the hypervisor.
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.026 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.026 183407 DEBUG nova.compute.manager [-] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.027 183407 DEBUG nova.network.neutron [-] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.027 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.134 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.965 183407 DEBUG nova.compute.manager [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Received event network-vif-unplugged-0889ea6c-5b91-46c9-aefc-a5b43a896e33 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.965 183407 DEBUG oslo_concurrency.lockutils [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "2239680b-85e2-414c-a1a8-69e506b41f64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.965 183407 DEBUG oslo_concurrency.lockutils [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "2239680b-85e2-414c-a1a8-69e506b41f64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.966 183407 DEBUG oslo_concurrency.lockutils [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "2239680b-85e2-414c-a1a8-69e506b41f64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.966 183407 DEBUG nova.compute.manager [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] No waiting events found dispatching network-vif-unplugged-0889ea6c-5b91-46c9-aefc-a5b43a896e33 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.966 183407 DEBUG nova.compute.manager [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Received event network-vif-unplugged-0889ea6c-5b91-46c9-aefc-a5b43a896e33 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.966 183407 DEBUG nova.compute.manager [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Received event network-vif-deleted-0889ea6c-5b91-46c9-aefc-a5b43a896e33 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.966 183407 INFO nova.compute.manager [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Neutron deleted interface 0889ea6c-5b91-46c9-aefc-a5b43a896e33; detaching it from the instance and deleting it from the info cache
Jan 26 15:31:07 compute-1 nova_compute[183403]: 2026-01-26 15:31:07.966 183407 DEBUG nova.network.neutron [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:31:08 compute-1 nova_compute[183403]: 2026-01-26 15:31:08.027 183407 DEBUG nova.network.neutron [-] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:31:08 compute-1 nova_compute[183403]: 2026-01-26 15:31:08.474 183407 DEBUG nova.compute.manager [req-5feb4ee0-9899-4e60-9cd4-5dc7543a1420 req-0d128446-9b65-4677-a1f9-4149bbb7e5c1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Detach interface failed, port_id=0889ea6c-5b91-46c9-aefc-a5b43a896e33, reason: Instance 2239680b-85e2-414c-a1a8-69e506b41f64 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:31:08 compute-1 nova_compute[183403]: 2026-01-26 15:31:08.533 183407 INFO nova.compute.manager [-] [instance: 2239680b-85e2-414c-a1a8-69e506b41f64] Took 1.51 seconds to deallocate network for instance.
Jan 26 15:31:09 compute-1 nova_compute[183403]: 2026-01-26 15:31:09.052 183407 DEBUG oslo_concurrency.lockutils [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:31:09 compute-1 nova_compute[183403]: 2026-01-26 15:31:09.052 183407 DEBUG oslo_concurrency.lockutils [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:31:09 compute-1 nova_compute[183403]: 2026-01-26 15:31:09.059 183407 DEBUG oslo_concurrency.lockutils [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:31:09 compute-1 nova_compute[183403]: 2026-01-26 15:31:09.096 183407 INFO nova.scheduler.client.report [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Deleted allocations for instance 2239680b-85e2-414c-a1a8-69e506b41f64
Jan 26 15:31:10 compute-1 nova_compute[183403]: 2026-01-26 15:31:10.125 183407 DEBUG oslo_concurrency.lockutils [None req-4b982b9b-e1af-435e-b046-d73112e7bbfe d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "2239680b-85e2-414c-a1a8-69e506b41f64" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:31:11 compute-1 nova_compute[183403]: 2026-01-26 15:31:11.319 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:11 compute-1 nova_compute[183403]: 2026-01-26 15:31:11.498 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:16 compute-1 nova_compute[183403]: 2026-01-26 15:31:16.447 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:16 compute-1 nova_compute[183403]: 2026-01-26 15:31:16.500 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:17 compute-1 nova_compute[183403]: 2026-01-26 15:31:17.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:19 compute-1 openstack_network_exporter[195610]: ERROR   15:31:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:31:19 compute-1 openstack_network_exporter[195610]: ERROR   15:31:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:31:19 compute-1 nova_compute[183403]: 2026-01-26 15:31:19.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:21 compute-1 nova_compute[183403]: 2026-01-26 15:31:21.490 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:21 compute-1 nova_compute[183403]: 2026-01-26 15:31:21.501 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:22 compute-1 nova_compute[183403]: 2026-01-26 15:31:22.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:22 compute-1 nova_compute[183403]: 2026-01-26 15:31:22.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:31:23 compute-1 nova_compute[183403]: 2026-01-26 15:31:23.084 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:31:24 compute-1 podman[212429]: 2026-01-26 15:31:24.907282669 +0000 UTC m=+0.081791911 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:31:24 compute-1 podman[212430]: 2026-01-26 15:31:24.917618937 +0000 UTC m=+0.092009566 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public)
Jan 26 15:31:25 compute-1 nova_compute[183403]: 2026-01-26 15:31:25.086 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:25 compute-1 nova_compute[183403]: 2026-01-26 15:31:25.086 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:25 compute-1 nova_compute[183403]: 2026-01-26 15:31:25.754 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:31:25 compute-1 nova_compute[183403]: 2026-01-26 15:31:25.754 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:31:25 compute-1 nova_compute[183403]: 2026-01-26 15:31:25.755 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:31:25 compute-1 nova_compute[183403]: 2026-01-26 15:31:25.755 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:31:25 compute-1 nova_compute[183403]: 2026-01-26 15:31:25.995 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:31:25 compute-1 nova_compute[183403]: 2026-01-26 15:31:25.996 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.015 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.016 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5831MB free_disk=73.14489364624023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.016 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.017 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.502 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.503 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.504 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.504 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.531 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:26 compute-1 nova_compute[183403]: 2026-01-26 15:31:26.532 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:31:27 compute-1 nova_compute[183403]: 2026-01-26 15:31:27.058 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:31:27 compute-1 nova_compute[183403]: 2026-01-26 15:31:27.058 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:31:26 up  1:26,  0 user,  load average: 0.02, 0.16, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:31:27 compute-1 nova_compute[183403]: 2026-01-26 15:31:27.077 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:31:27 compute-1 nova_compute[183403]: 2026-01-26 15:31:27.585 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:31:28 compute-1 nova_compute[183403]: 2026-01-26 15:31:28.095 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:31:28 compute-1 nova_compute[183403]: 2026-01-26 15:31:28.095 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.079s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:31:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:29.083 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:31:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:29.083 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:31:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:29.084 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:31:30 compute-1 nova_compute[183403]: 2026-01-26 15:31:30.584 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:30 compute-1 nova_compute[183403]: 2026-01-26 15:31:30.585 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:30 compute-1 nova_compute[183403]: 2026-01-26 15:31:30.586 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:30 compute-1 nova_compute[183403]: 2026-01-26 15:31:30.586 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:30 compute-1 nova_compute[183403]: 2026-01-26 15:31:30.587 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:31:31 compute-1 nova_compute[183403]: 2026-01-26 15:31:31.533 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:31:31 compute-1 nova_compute[183403]: 2026-01-26 15:31:31.537 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:31:31 compute-1 nova_compute[183403]: 2026-01-26 15:31:31.537 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 26 15:31:31 compute-1 nova_compute[183403]: 2026-01-26 15:31:31.537 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:31:31 compute-1 nova_compute[183403]: 2026-01-26 15:31:31.570 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:31 compute-1 nova_compute[183403]: 2026-01-26 15:31:31.573 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:31:32 compute-1 nova_compute[183403]: 2026-01-26 15:31:32.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:32 compute-1 nova_compute[183403]: 2026-01-26 15:31:32.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:31:35 compute-1 podman[192725]: time="2026-01-26T15:31:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:31:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:31:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:31:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:31:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2191 "" "Go-http-client/1.1"
Jan 26 15:31:35 compute-1 podman[212481]: 2026-01-26 15:31:35.927265181 +0000 UTC m=+0.092402057 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120)
Jan 26 15:31:35 compute-1 podman[212480]: 2026-01-26 15:31:35.963891046 +0000 UTC m=+0.136813000 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 26 15:31:36 compute-1 nova_compute[183403]: 2026-01-26 15:31:36.573 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:37.032 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:31:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:37.032 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:31:37 compute-1 nova_compute[183403]: 2026-01-26 15:31:37.033 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:41 compute-1 nova_compute[183403]: 2026-01-26 15:31:41.574 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:31:43.034 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:31:44 compute-1 nova_compute[183403]: 2026-01-26 15:31:44.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:31:46 compute-1 nova_compute[183403]: 2026-01-26 15:31:46.577 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:31:46 compute-1 nova_compute[183403]: 2026-01-26 15:31:46.579 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:31:46 compute-1 nova_compute[183403]: 2026-01-26 15:31:46.579 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 26 15:31:46 compute-1 nova_compute[183403]: 2026-01-26 15:31:46.579 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:31:46 compute-1 nova_compute[183403]: 2026-01-26 15:31:46.604 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:46 compute-1 nova_compute[183403]: 2026-01-26 15:31:46.605 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:31:49 compute-1 openstack_network_exporter[195610]: ERROR   15:31:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:31:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:31:49 compute-1 openstack_network_exporter[195610]: ERROR   15:31:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:31:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:31:51 compute-1 nova_compute[183403]: 2026-01-26 15:31:51.605 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:51 compute-1 nova_compute[183403]: 2026-01-26 15:31:51.607 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:55 compute-1 podman[212529]: 2026-01-26 15:31:55.903401193 +0000 UTC m=+0.071283858 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:31:55 compute-1 podman[212530]: 2026-01-26 15:31:55.938026344 +0000 UTC m=+0.098275714 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Jan 26 15:31:56 compute-1 nova_compute[183403]: 2026-01-26 15:31:56.609 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:31:56 compute-1 nova_compute[183403]: 2026-01-26 15:31:56.611 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:31:56 compute-1 nova_compute[183403]: 2026-01-26 15:31:56.612 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 26 15:31:56 compute-1 nova_compute[183403]: 2026-01-26 15:31:56.612 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:31:56 compute-1 nova_compute[183403]: 2026-01-26 15:31:56.650 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:31:56 compute-1 nova_compute[183403]: 2026-01-26 15:31:56.651 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:32:00 compute-1 nova_compute[183403]: 2026-01-26 15:32:00.498 183407 DEBUG nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Creating tmpfile /var/lib/nova/instances/tmp3phlm5e9 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:32:00 compute-1 nova_compute[183403]: 2026-01-26 15:32:00.499 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:00 compute-1 nova_compute[183403]: 2026-01-26 15:32:00.511 183407 DEBUG nova.compute.manager [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3phlm5e9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:32:01 compute-1 nova_compute[183403]: 2026-01-26 15:32:01.652 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:32:02 compute-1 nova_compute[183403]: 2026-01-26 15:32:02.553 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:02 compute-1 sshd-session[212573]: Connection closed by 80.94.92.168 port 50216
Jan 26 15:32:05 compute-1 podman[192725]: time="2026-01-26T15:32:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:32:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:32:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:32:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:32:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Jan 26 15:32:06 compute-1 nova_compute[183403]: 2026-01-26 15:32:06.435 183407 DEBUG nova.compute.manager [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3phlm5e9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0eb49996-7b21-4728-a0c0-cf817cd788e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:32:06 compute-1 nova_compute[183403]: 2026-01-26 15:32:06.654 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:32:06 compute-1 nova_compute[183403]: 2026-01-26 15:32:06.656 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:32:06 compute-1 nova_compute[183403]: 2026-01-26 15:32:06.656 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 26 15:32:06 compute-1 nova_compute[183403]: 2026-01-26 15:32:06.657 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:32:06 compute-1 nova_compute[183403]: 2026-01-26 15:32:06.702 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:06 compute-1 nova_compute[183403]: 2026-01-26 15:32:06.704 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 15:32:06 compute-1 podman[212575]: 2026-01-26 15:32:06.933158086 +0000 UTC m=+0.093628099 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 26 15:32:06 compute-1 podman[212574]: 2026-01-26 15:32:06.942149728 +0000 UTC m=+0.115704473 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 26 15:32:07 compute-1 nova_compute[183403]: 2026-01-26 15:32:07.453 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-0eb49996-7b21-4728-a0c0-cf817cd788e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:32:07 compute-1 nova_compute[183403]: 2026-01-26 15:32:07.454 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-0eb49996-7b21-4728-a0c0-cf817cd788e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:32:07 compute-1 nova_compute[183403]: 2026-01-26 15:32:07.454 183407 DEBUG nova.network.neutron [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:32:07 compute-1 nova_compute[183403]: 2026-01-26 15:32:07.962 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:08 compute-1 nova_compute[183403]: 2026-01-26 15:32:08.586 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:08 compute-1 nova_compute[183403]: 2026-01-26 15:32:08.898 183407 DEBUG nova.network.neutron [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Updating instance_info_cache with network_info: [{"id": "058045bb-dfca-4150-8a79-85fb7fad72ee", "address": "fa:16:3e:8a:02:63", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058045bb-df", "ovs_interfaceid": "058045bb-dfca-4150-8a79-85fb7fad72ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:32:09 compute-1 nova_compute[183403]: 2026-01-26 15:32:09.407 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-0eb49996-7b21-4728-a0c0-cf817cd788e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:32:09 compute-1 nova_compute[183403]: 2026-01-26 15:32:09.436 183407 DEBUG nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3phlm5e9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0eb49996-7b21-4728-a0c0-cf817cd788e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:32:09 compute-1 nova_compute[183403]: 2026-01-26 15:32:09.437 183407 DEBUG nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Creating instance directory: /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:32:09 compute-1 nova_compute[183403]: 2026-01-26 15:32:09.437 183407 DEBUG nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Creating disk.info with the contents: {'/var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk': 'qcow2', '/var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:32:09 compute-1 nova_compute[183403]: 2026-01-26 15:32:09.438 183407 DEBUG nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:32:09 compute-1 nova_compute[183403]: 2026-01-26 15:32:09.439 183407 DEBUG nova.objects.instance [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0eb49996-7b21-4728-a0c0-cf817cd788e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:32:09 compute-1 nova_compute[183403]: 2026-01-26 15:32:09.947 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:32:09 compute-1 nova_compute[183403]: 2026-01-26 15:32:09.951 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:32:09 compute-1 nova_compute[183403]: 2026-01-26 15:32:09.953 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.006 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.008 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.009 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.009 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.016 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.017 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.116 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.117 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.165 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.167 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.168 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.252 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.254 183407 DEBUG nova.virt.disk.api [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.254 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.325 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.327 183407 DEBUG nova.virt.disk.api [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.328 183407 DEBUG nova.objects.instance [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 0eb49996-7b21-4728-a0c0-cf817cd788e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.848 183407 DEBUG nova.objects.base [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<0eb49996-7b21-4728-a0c0-cf817cd788e6> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.849 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.890 183407 DEBUG oslo_concurrency.processutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk.config 497664" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.891 183407 DEBUG nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.893 183407 DEBUG nova.virt.libvirt.vif [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:31:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1497496386',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1497496386',id=24,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:31:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2c23d857cca949afb2559c9276298f2f',ramdisk_id='',reservation_id='r-poy1adt1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-418427150',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-418427150-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:31:27Z,user_data=None,user_id='d0d771a34e2643d782edb3717de7f449',uuid=0eb49996-7b21-4728-a0c0-cf817cd788e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "058045bb-dfca-4150-8a79-85fb7fad72ee", "address": "fa:16:3e:8a:02:63", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap058045bb-df", "ovs_interfaceid": "058045bb-dfca-4150-8a79-85fb7fad72ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.894 183407 DEBUG nova.network.os_vif_util [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "058045bb-dfca-4150-8a79-85fb7fad72ee", "address": "fa:16:3e:8a:02:63", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap058045bb-df", "ovs_interfaceid": "058045bb-dfca-4150-8a79-85fb7fad72ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.895 183407 DEBUG nova.network.os_vif_util [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:02:63,bridge_name='br-int',has_traffic_filtering=True,id=058045bb-dfca-4150-8a79-85fb7fad72ee,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058045bb-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.896 183407 DEBUG os_vif [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:02:63,bridge_name='br-int',has_traffic_filtering=True,id=058045bb-dfca-4150-8a79-85fb7fad72ee,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058045bb-df') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.898 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.899 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.900 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.901 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.901 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3979e7b6-a96e-5917-b737-cca73830750e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.903 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.905 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.909 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.910 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap058045bb-df, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.911 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap058045bb-df, col_values=(('qos', UUID('00ee4606-6741-40b7-adb5-43f87022b533')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.911 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap058045bb-df, col_values=(('external_ids', {'iface-id': '058045bb-dfca-4150-8a79-85fb7fad72ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:02:63', 'vm-uuid': '0eb49996-7b21-4728-a0c0-cf817cd788e6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.913 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:10 compute-1 NetworkManager[55716]: <info>  [1769441530.9144] manager: (tap058045bb-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.916 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.920 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.922 183407 INFO os_vif [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:02:63,bridge_name='br-int',has_traffic_filtering=True,id=058045bb-dfca-4150-8a79-85fb7fad72ee,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058045bb-df')
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.923 183407 DEBUG nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.923 183407 DEBUG nova.compute.manager [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3phlm5e9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0eb49996-7b21-4728-a0c0-cf817cd788e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:32:10 compute-1 nova_compute[183403]: 2026-01-26 15:32:10.924 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:11 compute-1 nova_compute[183403]: 2026-01-26 15:32:11.226 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:11 compute-1 nova_compute[183403]: 2026-01-26 15:32:11.708 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:12 compute-1 nova_compute[183403]: 2026-01-26 15:32:12.289 183407 DEBUG nova.network.neutron [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Port 058045bb-dfca-4150-8a79-85fb7fad72ee updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:32:12 compute-1 nova_compute[183403]: 2026-01-26 15:32:12.305 183407 DEBUG nova.compute.manager [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3phlm5e9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0eb49996-7b21-4728-a0c0-cf817cd788e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:32:15 compute-1 ovn_controller[95641]: 2026-01-26T15:32:15Z|00183|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 15:32:15 compute-1 NetworkManager[55716]: <info>  [1769441535.7518] manager: (tap058045bb-df): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Jan 26 15:32:15 compute-1 kernel: tap058045bb-df: entered promiscuous mode
Jan 26 15:32:15 compute-1 ovn_controller[95641]: 2026-01-26T15:32:15Z|00184|binding|INFO|Claiming lport 058045bb-dfca-4150-8a79-85fb7fad72ee for this additional chassis.
Jan 26 15:32:15 compute-1 ovn_controller[95641]: 2026-01-26T15:32:15Z|00185|binding|INFO|058045bb-dfca-4150-8a79-85fb7fad72ee: Claiming fa:16:3e:8a:02:63 10.100.0.5
Jan 26 15:32:15 compute-1 nova_compute[183403]: 2026-01-26 15:32:15.755 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.762 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:02:63 10.100.0.5'], port_security=['fa:16:3e:8a:02:63 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0eb49996-7b21-4728-a0c0-cf817cd788e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7bb9409-21ac-404c-881a-401a33317e0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c23d857cca949afb2559c9276298f2f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2816b4e0-6d42-4df2-a497-a52d8e0e90c8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fe2c46a-3344-4b49-9cc4-4db510e2e673, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=058045bb-dfca-4150-8a79-85fb7fad72ee) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.763 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 058045bb-dfca-4150-8a79-85fb7fad72ee in datapath d7bb9409-21ac-404c-881a-401a33317e0b unbound from our chassis
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.764 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7bb9409-21ac-404c-881a-401a33317e0b
Jan 26 15:32:15 compute-1 ovn_controller[95641]: 2026-01-26T15:32:15Z|00186|binding|INFO|Setting lport 058045bb-dfca-4150-8a79-85fb7fad72ee ovn-installed in OVS
Jan 26 15:32:15 compute-1 nova_compute[183403]: 2026-01-26 15:32:15.768 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:15 compute-1 nova_compute[183403]: 2026-01-26 15:32:15.771 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.776 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5cea55-315e-4f1f-a1ac-e7e282b7cde1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.777 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7bb9409-21 in ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.778 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7bb9409-20 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.778 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f86fb0c5-571e-4ef9-ad81-ae39a88911b9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.779 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e4155baf-4023-462c-9cfc-48623cc1dfe8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 systemd-machined[154697]: New machine qemu-17-instance-00000018.
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.790 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[834f7d72-a024-4349-bbb4-b33734c81e64]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.805 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9146d088-366d-4c33-abb7-1edb3080c2a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 systemd[1]: Started Virtual Machine qemu-17-instance-00000018.
Jan 26 15:32:15 compute-1 systemd-udevd[212660]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.830 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[e1fe2b67-0260-44a6-b1f0-7e20607c0b0e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 NetworkManager[55716]: <info>  [1769441535.8349] device (tap058045bb-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.835 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c3a503-cc83-40e0-ba99-e557990d6cbf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 NetworkManager[55716]: <info>  [1769441535.8369] manager: (tapd7bb9409-20): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Jan 26 15:32:15 compute-1 systemd-udevd[212662]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:32:15 compute-1 NetworkManager[55716]: <info>  [1769441535.8375] device (tap058045bb-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.865 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[f530786b-4b90-45af-9aa8-38722ff374dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.869 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[b7532dd1-48d8-4775-ab58-bfa5f9652973]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 NetworkManager[55716]: <info>  [1769441535.8889] device (tapd7bb9409-20): carrier: link connected
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.894 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[503b7e56-9b47-442f-99c3-a5ccb36dc0e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.912 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a55fa7-52e5-4cf3-ab74-13919d348cd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7bb9409-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:8f:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525849, 'reachable_time': 36066, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212687, 'error': None, 'target': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 nova_compute[183403]: 2026-01-26 15:32:15.913 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.926 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8182ba-7da1-45c7-884b-1fa0b9da784d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:8fee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525849, 'tstamp': 525849}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212689, 'error': None, 'target': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.940 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b5d0e1-cfaa-4228-826f-a0dd0d40062c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7bb9409-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:8f:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525849, 'reachable_time': 36066, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212690, 'error': None, 'target': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:15.976 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[045ef60c-bc43-4bea-91f9-60c0822174dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.056 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[67a29fed-8d99-4dc9-98cd-cca321dac886]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.057 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7bb9409-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.057 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.057 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7bb9409-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:16 compute-1 kernel: tapd7bb9409-20: entered promiscuous mode
Jan 26 15:32:16 compute-1 NetworkManager[55716]: <info>  [1769441536.0591] manager: (tapd7bb9409-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 26 15:32:16 compute-1 nova_compute[183403]: 2026-01-26 15:32:16.058 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:16 compute-1 nova_compute[183403]: 2026-01-26 15:32:16.060 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.061 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7bb9409-20, col_values=(('external_ids', {'iface-id': 'fe81a3ca-59e0-4072-a668-b5a59e5f3940'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:16 compute-1 nova_compute[183403]: 2026-01-26 15:32:16.062 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:16 compute-1 ovn_controller[95641]: 2026-01-26T15:32:16Z|00187|binding|INFO|Releasing lport fe81a3ca-59e0-4072-a668-b5a59e5f3940 from this chassis (sb_readonly=0)
Jan 26 15:32:16 compute-1 nova_compute[183403]: 2026-01-26 15:32:16.084 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.086 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bd596ef3-da23-4737-aa4c-00c27eb7262f]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.087 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.087 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.087 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d7bb9409-21ac-404c-881a-401a33317e0b disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.087 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.088 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a698bb46-e157-4a2f-b606-d14e174ca61c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.088 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.088 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[595693f2-c7dc-4e94-98d6-2e0bfb9804d5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.089 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-d7bb9409-21ac-404c-881a-401a33317e0b
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID d7bb9409-21ac-404c-881a-401a33317e0b
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:32:16 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:16.089 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'env', 'PROCESS_TAG=haproxy-d7bb9409-21ac-404c-881a-401a33317e0b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7bb9409-21ac-404c-881a-401a33317e0b.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:32:16 compute-1 podman[212729]: 2026-01-26 15:32:16.523606746 +0000 UTC m=+0.029354381 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:32:16 compute-1 nova_compute[183403]: 2026-01-26 15:32:16.763 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:16 compute-1 podman[212729]: 2026-01-26 15:32:16.845960597 +0000 UTC m=+0.351708192 container create f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120)
Jan 26 15:32:16 compute-1 systemd[1]: Started libpod-conmon-f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d.scope.
Jan 26 15:32:16 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:32:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54ffde7b6a96ab9aab23155087a2fd768bf77ee1623ad5aa6ad3ddb5814dbcfd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:32:16 compute-1 podman[212729]: 2026-01-26 15:32:16.957473867 +0000 UTC m=+0.463221502 container init f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 15:32:16 compute-1 podman[212729]: 2026-01-26 15:32:16.962626655 +0000 UTC m=+0.468374260 container start f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:32:16 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212759]: [NOTICE]   (212763) : New worker (212765) forked
Jan 26 15:32:16 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212759]: [NOTICE]   (212763) : Loading success.
Jan 26 15:32:18 compute-1 nova_compute[183403]: 2026-01-26 15:32:18.084 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:32:19 compute-1 openstack_network_exporter[195610]: ERROR   15:32:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:32:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:32:19 compute-1 openstack_network_exporter[195610]: ERROR   15:32:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:32:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:32:20 compute-1 ovn_controller[95641]: 2026-01-26T15:32:20Z|00188|binding|INFO|Claiming lport 058045bb-dfca-4150-8a79-85fb7fad72ee for this chassis.
Jan 26 15:32:20 compute-1 ovn_controller[95641]: 2026-01-26T15:32:20Z|00189|binding|INFO|058045bb-dfca-4150-8a79-85fb7fad72ee: Claiming fa:16:3e:8a:02:63 10.100.0.5
Jan 26 15:32:20 compute-1 ovn_controller[95641]: 2026-01-26T15:32:20Z|00190|binding|INFO|Setting lport 058045bb-dfca-4150-8a79-85fb7fad72ee up in Southbound
Jan 26 15:32:20 compute-1 nova_compute[183403]: 2026-01-26 15:32:20.915 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:21 compute-1 nova_compute[183403]: 2026-01-26 15:32:21.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:32:21 compute-1 nova_compute[183403]: 2026-01-26 15:32:21.676 183407 INFO nova.compute.manager [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Post operation of migration started
Jan 26 15:32:21 compute-1 nova_compute[183403]: 2026-01-26 15:32:21.677 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:21 compute-1 nova_compute[183403]: 2026-01-26 15:32:21.767 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:22 compute-1 nova_compute[183403]: 2026-01-26 15:32:22.085 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:22 compute-1 nova_compute[183403]: 2026-01-26 15:32:22.086 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:22 compute-1 nova_compute[183403]: 2026-01-26 15:32:22.221 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-0eb49996-7b21-4728-a0c0-cf817cd788e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:32:22 compute-1 nova_compute[183403]: 2026-01-26 15:32:22.221 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-0eb49996-7b21-4728-a0c0-cf817cd788e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:32:22 compute-1 nova_compute[183403]: 2026-01-26 15:32:22.222 183407 DEBUG nova.network.neutron [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:32:22 compute-1 nova_compute[183403]: 2026-01-26 15:32:22.731 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:23 compute-1 nova_compute[183403]: 2026-01-26 15:32:23.544 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:23 compute-1 nova_compute[183403]: 2026-01-26 15:32:23.702 183407 DEBUG nova.network.neutron [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Updating instance_info_cache with network_info: [{"id": "058045bb-dfca-4150-8a79-85fb7fad72ee", "address": "fa:16:3e:8a:02:63", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058045bb-df", "ovs_interfaceid": "058045bb-dfca-4150-8a79-85fb7fad72ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:32:24 compute-1 nova_compute[183403]: 2026-01-26 15:32:24.208 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-0eb49996-7b21-4728-a0c0-cf817cd788e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:32:24 compute-1 nova_compute[183403]: 2026-01-26 15:32:24.734 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:24 compute-1 nova_compute[183403]: 2026-01-26 15:32:24.735 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:24 compute-1 nova_compute[183403]: 2026-01-26 15:32:24.736 183407 DEBUG oslo_concurrency.lockutils [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:24 compute-1 nova_compute[183403]: 2026-01-26 15:32:24.743 183407 INFO nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:32:24 compute-1 virtqemud[183290]: Domain id=17 name='instance-00000018' uuid=0eb49996-7b21-4728-a0c0-cf817cd788e6 is tainted: custom-monitor
Jan 26 15:32:25 compute-1 nova_compute[183403]: 2026-01-26 15:32:25.753 183407 INFO nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:32:25 compute-1 nova_compute[183403]: 2026-01-26 15:32:25.918 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:26 compute-1 nova_compute[183403]: 2026-01-26 15:32:26.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:32:26 compute-1 nova_compute[183403]: 2026-01-26 15:32:26.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:32:26 compute-1 nova_compute[183403]: 2026-01-26 15:32:26.758 183407 INFO nova.virt.libvirt.driver [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:32:26 compute-1 nova_compute[183403]: 2026-01-26 15:32:26.765 183407 DEBUG nova.compute.manager [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:32:26 compute-1 nova_compute[183403]: 2026-01-26 15:32:26.770 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:26 compute-1 podman[212774]: 2026-01-26 15:32:26.922133072 +0000 UTC m=+0.089387415 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:32:26 compute-1 podman[212775]: 2026-01-26 15:32:26.930745954 +0000 UTC m=+0.098459560 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 15:32:27 compute-1 nova_compute[183403]: 2026-01-26 15:32:27.089 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:27 compute-1 nova_compute[183403]: 2026-01-26 15:32:27.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:27 compute-1 nova_compute[183403]: 2026-01-26 15:32:27.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:27 compute-1 nova_compute[183403]: 2026-01-26 15:32:27.090 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:32:27 compute-1 nova_compute[183403]: 2026-01-26 15:32:27.280 183407 DEBUG nova.objects.instance [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.135 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.193 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.195 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.279 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.430 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.502 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.503 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.533 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.533 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5613MB free_disk=73.11590576171875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.533 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:28 compute-1 nova_compute[183403]: 2026-01-26 15:32:28.534 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:29.084 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:29.084 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:29.085 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:29 compute-1 nova_compute[183403]: 2026-01-26 15:32:29.208 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:29 compute-1 nova_compute[183403]: 2026-01-26 15:32:29.209 183407 WARNING neutronclient.v2_0.client [None req-85365151-0712-4bfa-8714-de234366692d a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:29 compute-1 nova_compute[183403]: 2026-01-26 15:32:29.555 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Applying migration context for instance 0eb49996-7b21-4728-a0c0-cf817cd788e6 as it has an incoming, in-progress migration cb6b67a9-360f-4d40-928e-c77531baafd0. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Jan 26 15:32:29 compute-1 nova_compute[183403]: 2026-01-26 15:32:29.556 183407 DEBUG nova.objects.instance [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.064 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.353 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 0eb49996-7b21-4728-a0c0-cf817cd788e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.353 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.354 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:32:28 up  1:27,  0 user,  load average: 0.08, 0.14, 0.22\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_2c23d857cca949afb2559c9276298f2f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.427 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.474 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.475 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.487 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.506 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.556 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:32:30 compute-1 nova_compute[183403]: 2026-01-26 15:32:30.919 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:31 compute-1 nova_compute[183403]: 2026-01-26 15:32:31.066 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:32:31 compute-1 nova_compute[183403]: 2026-01-26 15:32:31.584 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:32:31 compute-1 nova_compute[183403]: 2026-01-26 15:32:31.585 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.051s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:31 compute-1 nova_compute[183403]: 2026-01-26 15:32:31.783 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:32 compute-1 nova_compute[183403]: 2026-01-26 15:32:32.585 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:32:32 compute-1 nova_compute[183403]: 2026-01-26 15:32:32.585 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:32:33 compute-1 nova_compute[183403]: 2026-01-26 15:32:33.099 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:32:33 compute-1 nova_compute[183403]: 2026-01-26 15:32:33.099 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:32:33 compute-1 nova_compute[183403]: 2026-01-26 15:32:33.100 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:32:33 compute-1 nova_compute[183403]: 2026-01-26 15:32:33.100 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:32:35 compute-1 podman[192725]: time="2026-01-26T15:32:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:32:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:32:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:32:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:32:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2659 "" "Go-http-client/1.1"
Jan 26 15:32:35 compute-1 nova_compute[183403]: 2026-01-26 15:32:35.921 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:36 compute-1 nova_compute[183403]: 2026-01-26 15:32:36.831 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:37 compute-1 podman[212827]: 2026-01-26 15:32:37.924742496 +0000 UTC m=+0.083791295 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 15:32:37 compute-1 podman[212826]: 2026-01-26 15:32:37.972830299 +0000 UTC m=+0.136704908 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 15:32:40 compute-1 nova_compute[183403]: 2026-01-26 15:32:40.924 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:41 compute-1 nova_compute[183403]: 2026-01-26 15:32:41.833 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:41 compute-1 nova_compute[183403]: 2026-01-26 15:32:41.904 183407 DEBUG oslo_concurrency.lockutils [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Acquiring lock "0eb49996-7b21-4728-a0c0-cf817cd788e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:41 compute-1 nova_compute[183403]: 2026-01-26 15:32:41.905 183407 DEBUG oslo_concurrency.lockutils [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "0eb49996-7b21-4728-a0c0-cf817cd788e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:41 compute-1 nova_compute[183403]: 2026-01-26 15:32:41.905 183407 DEBUG oslo_concurrency.lockutils [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Acquiring lock "0eb49996-7b21-4728-a0c0-cf817cd788e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:41 compute-1 nova_compute[183403]: 2026-01-26 15:32:41.905 183407 DEBUG oslo_concurrency.lockutils [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "0eb49996-7b21-4728-a0c0-cf817cd788e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:41 compute-1 nova_compute[183403]: 2026-01-26 15:32:41.905 183407 DEBUG oslo_concurrency.lockutils [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "0eb49996-7b21-4728-a0c0-cf817cd788e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:41 compute-1 nova_compute[183403]: 2026-01-26 15:32:41.923 183407 INFO nova.compute.manager [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Terminating instance
Jan 26 15:32:42 compute-1 nova_compute[183403]: 2026-01-26 15:32:42.442 183407 DEBUG nova.compute.manager [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:32:42 compute-1 kernel: tap058045bb-df (unregistering): left promiscuous mode
Jan 26 15:32:42 compute-1 NetworkManager[55716]: <info>  [1769441562.7385] device (tap058045bb-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:32:42 compute-1 ovn_controller[95641]: 2026-01-26T15:32:42Z|00191|binding|INFO|Releasing lport 058045bb-dfca-4150-8a79-85fb7fad72ee from this chassis (sb_readonly=0)
Jan 26 15:32:42 compute-1 ovn_controller[95641]: 2026-01-26T15:32:42Z|00192|binding|INFO|Setting lport 058045bb-dfca-4150-8a79-85fb7fad72ee down in Southbound
Jan 26 15:32:42 compute-1 nova_compute[183403]: 2026-01-26 15:32:42.759 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:42 compute-1 ovn_controller[95641]: 2026-01-26T15:32:42Z|00193|binding|INFO|Removing iface tap058045bb-df ovn-installed in OVS
Jan 26 15:32:42 compute-1 nova_compute[183403]: 2026-01-26 15:32:42.761 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:42 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:42.770 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:02:63 10.100.0.5'], port_security=['fa:16:3e:8a:02:63 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0eb49996-7b21-4728-a0c0-cf817cd788e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7bb9409-21ac-404c-881a-401a33317e0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c23d857cca949afb2559c9276298f2f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '2816b4e0-6d42-4df2-a497-a52d8e0e90c8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fe2c46a-3344-4b49-9cc4-4db510e2e673, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=058045bb-dfca-4150-8a79-85fb7fad72ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:32:42 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:42.772 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 058045bb-dfca-4150-8a79-85fb7fad72ee in datapath d7bb9409-21ac-404c-881a-401a33317e0b unbound from our chassis
Jan 26 15:32:42 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:42.774 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7bb9409-21ac-404c-881a-401a33317e0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:32:42 compute-1 nova_compute[183403]: 2026-01-26 15:32:42.777 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:42 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:42.777 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e042b7-e988-4c50-a52e-343799eee914]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:42 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:42.779 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b namespace which is not needed anymore
Jan 26 15:32:42 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 26 15:32:42 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000018.scope: Consumed 2.437s CPU time.
Jan 26 15:32:42 compute-1 systemd-machined[154697]: Machine qemu-17-instance-00000018 terminated.
Jan 26 15:32:42 compute-1 nova_compute[183403]: 2026-01-26 15:32:42.934 183407 INFO nova.virt.libvirt.driver [-] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Instance destroyed successfully.
Jan 26 15:32:42 compute-1 nova_compute[183403]: 2026-01-26 15:32:42.934 183407 DEBUG nova.objects.instance [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lazy-loading 'resources' on Instance uuid 0eb49996-7b21-4728-a0c0-cf817cd788e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:32:42 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212759]: [NOTICE]   (212763) : haproxy version is 3.0.5-8e879a5
Jan 26 15:32:42 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212759]: [NOTICE]   (212763) : path to executable is /usr/sbin/haproxy
Jan 26 15:32:42 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212759]: [WARNING]  (212763) : Exiting Master process...
Jan 26 15:32:42 compute-1 podman[212904]: 2026-01-26 15:32:42.962265758 +0000 UTC m=+0.046433820 container kill f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 15:32:42 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212759]: [ALERT]    (212763) : Current worker (212765) exited with code 143 (Terminated)
Jan 26 15:32:42 compute-1 neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b[212759]: [WARNING]  (212763) : All workers exited. Exiting... (0)
Jan 26 15:32:42 compute-1 systemd[1]: libpod-f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d.scope: Deactivated successfully.
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.359 183407 DEBUG nova.compute.manager [req-6fc06494-f6e7-4c67-8008-4fc66b0916b3 req-b4247405-5867-4daf-b019-06c520c8c53b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Received event network-vif-unplugged-058045bb-dfca-4150-8a79-85fb7fad72ee external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.359 183407 DEBUG oslo_concurrency.lockutils [req-6fc06494-f6e7-4c67-8008-4fc66b0916b3 req-b4247405-5867-4daf-b019-06c520c8c53b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "0eb49996-7b21-4728-a0c0-cf817cd788e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.360 183407 DEBUG oslo_concurrency.lockutils [req-6fc06494-f6e7-4c67-8008-4fc66b0916b3 req-b4247405-5867-4daf-b019-06c520c8c53b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "0eb49996-7b21-4728-a0c0-cf817cd788e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.360 183407 DEBUG oslo_concurrency.lockutils [req-6fc06494-f6e7-4c67-8008-4fc66b0916b3 req-b4247405-5867-4daf-b019-06c520c8c53b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "0eb49996-7b21-4728-a0c0-cf817cd788e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.361 183407 DEBUG nova.compute.manager [req-6fc06494-f6e7-4c67-8008-4fc66b0916b3 req-b4247405-5867-4daf-b019-06c520c8c53b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] No waiting events found dispatching network-vif-unplugged-058045bb-dfca-4150-8a79-85fb7fad72ee pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.361 183407 DEBUG nova.compute.manager [req-6fc06494-f6e7-4c67-8008-4fc66b0916b3 req-b4247405-5867-4daf-b019-06c520c8c53b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Received event network-vif-unplugged-058045bb-dfca-4150-8a79-85fb7fad72ee for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:32:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:43.417 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.418 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.441 183407 DEBUG nova.virt.libvirt.vif [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:31:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1497496386',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1497496386',id=24,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:31:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2c23d857cca949afb2559c9276298f2f',ramdisk_id='',reservation_id='r-poy1adt1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-418427150',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-418427150-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:32:27Z,user_data=None,user_id='d0d771a34e2643d782edb3717de7f449',uuid=0eb49996-7b21-4728-a0c0-cf817cd788e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "058045bb-dfca-4150-8a79-85fb7fad72ee", "address": "fa:16:3e:8a:02:63", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058045bb-df", "ovs_interfaceid": "058045bb-dfca-4150-8a79-85fb7fad72ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.442 183407 DEBUG nova.network.os_vif_util [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Converting VIF {"id": "058045bb-dfca-4150-8a79-85fb7fad72ee", "address": "fa:16:3e:8a:02:63", "network": {"id": "d7bb9409-21ac-404c-881a-401a33317e0b", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1176937166-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce5a22c9e1b44c8688bb5ce1d0d3ef81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058045bb-df", "ovs_interfaceid": "058045bb-dfca-4150-8a79-85fb7fad72ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.445 183407 DEBUG nova.network.os_vif_util [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:02:63,bridge_name='br-int',has_traffic_filtering=True,id=058045bb-dfca-4150-8a79-85fb7fad72ee,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058045bb-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.446 183407 DEBUG os_vif [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:02:63,bridge_name='br-int',has_traffic_filtering=True,id=058045bb-dfca-4150-8a79-85fb7fad72ee,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058045bb-df') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.449 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.451 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap058045bb-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.453 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.455 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.456 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.458 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.458 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=00ee4606-6741-40b7-adb5-43f87022b533) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.459 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.464 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.465 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:43 compute-1 podman[212925]: 2026-01-26 15:32:43.46906374 +0000 UTC m=+0.476519059 container died f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.474 183407 INFO os_vif [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:02:63,bridge_name='br-int',has_traffic_filtering=True,id=058045bb-dfca-4150-8a79-85fb7fad72ee,network=Network(d7bb9409-21ac-404c-881a-401a33317e0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058045bb-df')
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.475 183407 INFO nova.virt.libvirt.driver [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Deleting instance files /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6_del
Jan 26 15:32:43 compute-1 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.476 183407 INFO nova.virt.libvirt.driver [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Deletion of /var/lib/nova/instances/0eb49996-7b21-4728-a0c0-cf817cd788e6_del complete
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.990 183407 INFO nova.compute.manager [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Took 1.55 seconds to destroy the instance on the hypervisor.
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.991 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.991 183407 DEBUG nova.compute.manager [-] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.991 183407 DEBUG nova.network.neutron [-] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:32:43 compute-1 nova_compute[183403]: 2026-01-26 15:32:43.992 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:44 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d-userdata-shm.mount: Deactivated successfully.
Jan 26 15:32:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-54ffde7b6a96ab9aab23155087a2fd768bf77ee1623ad5aa6ad3ddb5814dbcfd-merged.mount: Deactivated successfully.
Jan 26 15:32:44 compute-1 podman[212925]: 2026-01-26 15:32:44.044618332 +0000 UTC m=+1.052073621 container cleanup f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 26 15:32:44 compute-1 systemd[1]: libpod-conmon-f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d.scope: Deactivated successfully.
Jan 26 15:32:44 compute-1 podman[212939]: 2026-01-26 15:32:44.119459105 +0000 UTC m=+0.632578777 container remove f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.127 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5cda3a4b-98ff-455e-bbed-062b2d609cea]: (4, ("Mon Jan 26 03:32:42 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b (f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d)\nf79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d\nMon Jan 26 03:32:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b (f79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d)\nf79861152912162b48bc02f07313703f27573210c5e2578f39059d32cd7d539d\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.129 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0007b03f-ab6f-43e9-b135-7a0389879937]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.131 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7bb9409-21ac-404c-881a-401a33317e0b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.132 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[64239589-5962-48c7-bf44-d57d66425be2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.133 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7bb9409-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:44 compute-1 kernel: tapd7bb9409-20: left promiscuous mode
Jan 26 15:32:44 compute-1 nova_compute[183403]: 2026-01-26 15:32:44.138 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:44 compute-1 nova_compute[183403]: 2026-01-26 15:32:44.161 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.165 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3cca63e3-47a4-464c-a926-3ed37b2d274a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.186 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[993d3bce-aa6a-4d27-ac12-2f41736d4ec9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.188 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5545fb61-bc4a-44a8-9fd9-0235c35a4c81]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:44 compute-1 nova_compute[183403]: 2026-01-26 15:32:44.217 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.218 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e194f353-fca7-479e-96b1-09fbbf9ef07a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525842, 'reachable_time': 35394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212959, 'error': None, 'target': 'ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.221 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7bb9409-21ac-404c-881a-401a33317e0b deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.221 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[44c080ed-8b3f-4262-82c3-0a3b9187bd86]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:32:44 compute-1 systemd[1]: run-netns-ovnmeta\x2dd7bb9409\x2d21ac\x2d404c\x2d881a\x2d401a33317e0b.mount: Deactivated successfully.
Jan 26 15:32:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:44.223 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:32:44 compute-1 nova_compute[183403]: 2026-01-26 15:32:44.541 183407 DEBUG nova.compute.manager [req-0e25b693-cccb-4c70-820e-3cb52187eb23 req-db2434c9-306c-4a34-8aed-12e70e950fb4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Received event network-vif-deleted-058045bb-dfca-4150-8a79-85fb7fad72ee external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:32:44 compute-1 nova_compute[183403]: 2026-01-26 15:32:44.542 183407 INFO nova.compute.manager [req-0e25b693-cccb-4c70-820e-3cb52187eb23 req-db2434c9-306c-4a34-8aed-12e70e950fb4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Neutron deleted interface 058045bb-dfca-4150-8a79-85fb7fad72ee; detaching it from the instance and deleting it from the info cache
Jan 26 15:32:44 compute-1 nova_compute[183403]: 2026-01-26 15:32:44.542 183407 DEBUG nova.network.neutron [req-0e25b693-cccb-4c70-820e-3cb52187eb23 req-db2434c9-306c-4a34-8aed-12e70e950fb4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:32:45 compute-1 nova_compute[183403]: 2026-01-26 15:32:45.002 183407 DEBUG nova.network.neutron [-] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:32:45 compute-1 nova_compute[183403]: 2026-01-26 15:32:45.051 183407 DEBUG nova.compute.manager [req-0e25b693-cccb-4c70-820e-3cb52187eb23 req-db2434c9-306c-4a34-8aed-12e70e950fb4 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Detach interface failed, port_id=058045bb-dfca-4150-8a79-85fb7fad72ee, reason: Instance 0eb49996-7b21-4728-a0c0-cf817cd788e6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:32:45 compute-1 nova_compute[183403]: 2026-01-26 15:32:45.429 183407 DEBUG nova.compute.manager [req-fc12cdf8-8252-44f5-bccf-00a63ce40a1c req-7735c571-1696-470e-8b8f-70f704dbb43c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Received event network-vif-unplugged-058045bb-dfca-4150-8a79-85fb7fad72ee external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:32:45 compute-1 nova_compute[183403]: 2026-01-26 15:32:45.430 183407 DEBUG oslo_concurrency.lockutils [req-fc12cdf8-8252-44f5-bccf-00a63ce40a1c req-7735c571-1696-470e-8b8f-70f704dbb43c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "0eb49996-7b21-4728-a0c0-cf817cd788e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:45 compute-1 nova_compute[183403]: 2026-01-26 15:32:45.430 183407 DEBUG oslo_concurrency.lockutils [req-fc12cdf8-8252-44f5-bccf-00a63ce40a1c req-7735c571-1696-470e-8b8f-70f704dbb43c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "0eb49996-7b21-4728-a0c0-cf817cd788e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:45 compute-1 nova_compute[183403]: 2026-01-26 15:32:45.431 183407 DEBUG oslo_concurrency.lockutils [req-fc12cdf8-8252-44f5-bccf-00a63ce40a1c req-7735c571-1696-470e-8b8f-70f704dbb43c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "0eb49996-7b21-4728-a0c0-cf817cd788e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:45 compute-1 nova_compute[183403]: 2026-01-26 15:32:45.431 183407 DEBUG nova.compute.manager [req-fc12cdf8-8252-44f5-bccf-00a63ce40a1c req-7735c571-1696-470e-8b8f-70f704dbb43c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] No waiting events found dispatching network-vif-unplugged-058045bb-dfca-4150-8a79-85fb7fad72ee pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:32:45 compute-1 nova_compute[183403]: 2026-01-26 15:32:45.431 183407 DEBUG nova.compute.manager [req-fc12cdf8-8252-44f5-bccf-00a63ce40a1c req-7735c571-1696-470e-8b8f-70f704dbb43c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Received event network-vif-unplugged-058045bb-dfca-4150-8a79-85fb7fad72ee for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:32:45 compute-1 nova_compute[183403]: 2026-01-26 15:32:45.514 183407 INFO nova.compute.manager [-] [instance: 0eb49996-7b21-4728-a0c0-cf817cd788e6] Took 1.52 seconds to deallocate network for instance.
Jan 26 15:32:46 compute-1 nova_compute[183403]: 2026-01-26 15:32:46.043 183407 DEBUG oslo_concurrency.lockutils [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:32:46 compute-1 nova_compute[183403]: 2026-01-26 15:32:46.044 183407 DEBUG oslo_concurrency.lockutils [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:32:46 compute-1 nova_compute[183403]: 2026-01-26 15:32:46.094 183407 DEBUG nova.compute.provider_tree [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:32:46 compute-1 nova_compute[183403]: 2026-01-26 15:32:46.604 183407 DEBUG nova.scheduler.client.report [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:32:46 compute-1 nova_compute[183403]: 2026-01-26 15:32:46.859 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:47 compute-1 nova_compute[183403]: 2026-01-26 15:32:47.113 183407 DEBUG oslo_concurrency.lockutils [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.070s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:47 compute-1 nova_compute[183403]: 2026-01-26 15:32:47.148 183407 INFO nova.scheduler.client.report [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Deleted allocations for instance 0eb49996-7b21-4728-a0c0-cf817cd788e6
Jan 26 15:32:48 compute-1 nova_compute[183403]: 2026-01-26 15:32:48.185 183407 DEBUG oslo_concurrency.lockutils [None req-48e6341b-468c-478d-a48c-3b6907e2c299 d0d771a34e2643d782edb3717de7f449 2c23d857cca949afb2559c9276298f2f - - default default] Lock "0eb49996-7b21-4728-a0c0-cf817cd788e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.281s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:32:48 compute-1 nova_compute[183403]: 2026-01-26 15:32:48.460 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:49 compute-1 openstack_network_exporter[195610]: ERROR   15:32:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:32:49 compute-1 openstack_network_exporter[195610]: ERROR   15:32:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:32:50 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:32:50.224 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:32:51 compute-1 nova_compute[183403]: 2026-01-26 15:32:51.861 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:53 compute-1 nova_compute[183403]: 2026-01-26 15:32:53.462 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:53 compute-1 nova_compute[183403]: 2026-01-26 15:32:53.632 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:56 compute-1 nova_compute[183403]: 2026-01-26 15:32:56.863 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:32:57 compute-1 podman[212963]: 2026-01-26 15:32:57.944656853 +0000 UTC m=+0.109819654 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter, vcs-type=git, container_name=openstack_network_exporter)
Jan 26 15:32:57 compute-1 podman[212962]: 2026-01-26 15:32:57.961381703 +0000 UTC m=+0.133116672 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:32:58 compute-1 nova_compute[183403]: 2026-01-26 15:32:58.464 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:01 compute-1 nova_compute[183403]: 2026-01-26 15:33:01.880 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:02.284 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:f7:3a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18d09bfe0e7479c8a237dd032889317', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76661f47-b7c7-4131-9a1a-0f8828404115, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=219edf33-3765-4d7c-87b5-4ab0ed1d6a8a) old=Port_Binding(mac=['fa:16:3e:92:f7:3a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18d09bfe0e7479c8a237dd032889317', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:33:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:02.285 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 219edf33-3765-4d7c-87b5-4ab0ed1d6a8a in datapath cc98e8b1-8169-4a08-8b22-cd8a87c017a0 updated
Jan 26 15:33:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:02.286 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc98e8b1-8169-4a08-8b22-cd8a87c017a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:33:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:02.287 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[041776aa-c10c-45ed-a690-c578732703e0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:33:03 compute-1 nova_compute[183403]: 2026-01-26 15:33:03.467 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:05 compute-1 podman[192725]: time="2026-01-26T15:33:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:33:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:33:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:33:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:33:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Jan 26 15:33:06 compute-1 nova_compute[183403]: 2026-01-26 15:33:06.882 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:08 compute-1 nova_compute[183403]: 2026-01-26 15:33:08.469 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:08 compute-1 podman[213008]: 2026-01-26 15:33:08.945938792 +0000 UTC m=+0.111055438 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 15:33:08 compute-1 podman[213007]: 2026-01-26 15:33:08.976902015 +0000 UTC m=+0.146369548 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20260120, config_id=ovn_controller)
Jan 26 15:33:11 compute-1 nova_compute[183403]: 2026-01-26 15:33:11.885 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:12.537 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:1f:1a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-933ab3ab-625d-4f5a-a82b-b03ecbbfaeea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-933ab3ab-625d-4f5a-a82b-b03ecbbfaeea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d28ba69-3fea-48c3-bb80-87b492f06f1d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7b0c9de1-263b-41e2-91a4-541aafab1019) old=Port_Binding(mac=['fa:16:3e:47:1f:1a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-933ab3ab-625d-4f5a-a82b-b03ecbbfaeea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-933ab3ab-625d-4f5a-a82b-b03ecbbfaeea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:33:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:12.541 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7b0c9de1-263b-41e2-91a4-541aafab1019 in datapath 933ab3ab-625d-4f5a-a82b-b03ecbbfaeea updated
Jan 26 15:33:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:12.542 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 933ab3ab-625d-4f5a-a82b-b03ecbbfaeea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:33:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:12.543 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4824242c-ea3f-4d5f-94bb-b3f59ddd9657]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:33:13 compute-1 nova_compute[183403]: 2026-01-26 15:33:13.471 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:16 compute-1 nova_compute[183403]: 2026-01-26 15:33:16.886 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:18 compute-1 nova_compute[183403]: 2026-01-26 15:33:18.473 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:18 compute-1 nova_compute[183403]: 2026-01-26 15:33:18.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:33:19 compute-1 openstack_network_exporter[195610]: ERROR   15:33:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:33:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:33:19 compute-1 openstack_network_exporter[195610]: ERROR   15:33:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:33:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:33:21 compute-1 nova_compute[183403]: 2026-01-26 15:33:21.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:33:21 compute-1 nova_compute[183403]: 2026-01-26 15:33:21.887 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:23 compute-1 nova_compute[183403]: 2026-01-26 15:33:23.475 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:25 compute-1 ovn_controller[95641]: 2026-01-26T15:33:25Z|00194|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 15:33:26 compute-1 nova_compute[183403]: 2026-01-26 15:33:26.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:33:26 compute-1 nova_compute[183403]: 2026-01-26 15:33:26.893 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.094 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.303 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.304 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.341 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.342 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5835MB free_disk=73.1448860168457GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.342 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:33:27 compute-1 nova_compute[183403]: 2026-01-26 15:33:27.343 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:33:28 compute-1 nova_compute[183403]: 2026-01-26 15:33:28.388 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:33:28 compute-1 nova_compute[183403]: 2026-01-26 15:33:28.389 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:33:27 up  1:28,  0 user,  load average: 0.22, 0.18, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:33:28 compute-1 nova_compute[183403]: 2026-01-26 15:33:28.452 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:33:28 compute-1 nova_compute[183403]: 2026-01-26 15:33:28.477 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:28 compute-1 podman[213056]: 2026-01-26 15:33:28.923503078 +0000 UTC m=+0.094950234 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:33:28 compute-1 podman[213057]: 2026-01-26 15:33:28.929420498 +0000 UTC m=+0.095049017 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 15:33:28 compute-1 nova_compute[183403]: 2026-01-26 15:33:28.960 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:33:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:29.086 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:33:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:29.086 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:33:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:29.086 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:33:29 compute-1 nova_compute[183403]: 2026-01-26 15:33:29.470 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:33:29 compute-1 nova_compute[183403]: 2026-01-26 15:33:29.470 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.128s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:33:30 compute-1 nova_compute[183403]: 2026-01-26 15:33:30.469 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:33:30 compute-1 nova_compute[183403]: 2026-01-26 15:33:30.470 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:33:30 compute-1 nova_compute[183403]: 2026-01-26 15:33:30.470 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:33:30 compute-1 nova_compute[183403]: 2026-01-26 15:33:30.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:33:30 compute-1 nova_compute[183403]: 2026-01-26 15:33:30.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:33:31 compute-1 nova_compute[183403]: 2026-01-26 15:33:31.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:33:31 compute-1 nova_compute[183403]: 2026-01-26 15:33:31.893 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:33 compute-1 nova_compute[183403]: 2026-01-26 15:33:33.479 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:35 compute-1 podman[192725]: time="2026-01-26T15:33:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:33:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:33:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:33:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:33:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:33:36 compute-1 nova_compute[183403]: 2026-01-26 15:33:36.900 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:38 compute-1 nova_compute[183403]: 2026-01-26 15:33:38.486 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:39 compute-1 podman[213101]: 2026-01-26 15:33:39.880241707 +0000 UTC m=+0.048950752 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 15:33:39 compute-1 podman[213100]: 2026-01-26 15:33:39.908472799 +0000 UTC m=+0.078162391 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 15:33:41 compute-1 nova_compute[183403]: 2026-01-26 15:33:41.901 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:43 compute-1 nova_compute[183403]: 2026-01-26 15:33:43.488 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:46 compute-1 nova_compute[183403]: 2026-01-26 15:33:46.916 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:48 compute-1 nova_compute[183403]: 2026-01-26 15:33:48.491 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:48 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:48.513 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:33:48 compute-1 nova_compute[183403]: 2026-01-26 15:33:48.514 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:48 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:48.514 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:33:49 compute-1 openstack_network_exporter[195610]: ERROR   15:33:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:33:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:33:49 compute-1 openstack_network_exporter[195610]: ERROR   15:33:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:33:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:33:49 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:33:49.516 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:33:51 compute-1 nova_compute[183403]: 2026-01-26 15:33:51.919 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:53 compute-1 nova_compute[183403]: 2026-01-26 15:33:53.493 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:56 compute-1 nova_compute[183403]: 2026-01-26 15:33:56.921 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:58 compute-1 nova_compute[183403]: 2026-01-26 15:33:58.496 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:33:59 compute-1 podman[213146]: 2026-01-26 15:33:59.913741776 +0000 UTC m=+0.088657503 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:33:59 compute-1 podman[213147]: 2026-01-26 15:33:59.923032748 +0000 UTC m=+0.082790816 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Jan 26 15:34:01 compute-1 nova_compute[183403]: 2026-01-26 15:34:01.923 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:03 compute-1 nova_compute[183403]: 2026-01-26 15:34:03.497 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:05 compute-1 podman[192725]: time="2026-01-26T15:34:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:34:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:34:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:34:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:34:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2192 "" "Go-http-client/1.1"
Jan 26 15:34:06 compute-1 nova_compute[183403]: 2026-01-26 15:34:06.925 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:08 compute-1 nova_compute[183403]: 2026-01-26 15:34:08.499 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:10 compute-1 podman[213195]: 2026-01-26 15:34:10.925961774 +0000 UTC m=+0.086242878 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:34:10 compute-1 podman[213194]: 2026-01-26 15:34:10.960305471 +0000 UTC m=+0.130801822 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 15:34:11 compute-1 nova_compute[183403]: 2026-01-26 15:34:11.926 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:13 compute-1 nova_compute[183403]: 2026-01-26 15:34:13.502 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:16 compute-1 nova_compute[183403]: 2026-01-26 15:34:16.928 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:18 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 15:34:18 compute-1 nova_compute[183403]: 2026-01-26 15:34:18.505 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:19 compute-1 openstack_network_exporter[195610]: ERROR   15:34:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:34:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:34:19 compute-1 openstack_network_exporter[195610]: ERROR   15:34:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:34:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:34:19 compute-1 nova_compute[183403]: 2026-01-26 15:34:19.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:34:21 compute-1 nova_compute[183403]: 2026-01-26 15:34:21.929 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:23 compute-1 nova_compute[183403]: 2026-01-26 15:34:23.519 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:23 compute-1 nova_compute[183403]: 2026-01-26 15:34:23.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:34:26 compute-1 nova_compute[183403]: 2026-01-26 15:34:26.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:34:26 compute-1 nova_compute[183403]: 2026-01-26 15:34:26.930 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.094 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.271 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.273 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.310 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.310 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5841MB free_disk=73.14504623413086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.311 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.311 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.397 183407 DEBUG nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Creating tmpfile /var/lib/nova/instances/tmp7vcunsmv to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.398 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.402 183407 DEBUG nova.compute.manager [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7vcunsmv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.443 183407 DEBUG nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Creating tmpfile /var/lib/nova/instances/tmpcyraiacu to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.443 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:27 compute-1 nova_compute[183403]: 2026-01-26 15:34:27.447 183407 DEBUG nova.compute.manager [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcyraiacu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:34:28 compute-1 nova_compute[183403]: 2026-01-26 15:34:28.523 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:28 compute-1 nova_compute[183403]: 2026-01-26 15:34:28.871 183407 WARNING nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 26 15:34:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:29.087 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:34:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:29.088 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:34:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:29.088 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:34:29 compute-1 nova_compute[183403]: 2026-01-26 15:34:29.385 183407 WARNING nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 26 15:34:29 compute-1 nova_compute[183403]: 2026-01-26 15:34:29.385 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:34:29 compute-1 nova_compute[183403]: 2026-01-26 15:34:29.386 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:34:27 up  1:29,  0 user,  load average: 0.15, 0.16, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:34:29 compute-1 nova_compute[183403]: 2026-01-26 15:34:29.434 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:29 compute-1 nova_compute[183403]: 2026-01-26 15:34:29.443 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:34:29 compute-1 nova_compute[183403]: 2026-01-26 15:34:29.477 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:29 compute-1 nova_compute[183403]: 2026-01-26 15:34:29.949 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:34:30 compute-1 nova_compute[183403]: 2026-01-26 15:34:30.459 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:34:30 compute-1 nova_compute[183403]: 2026-01-26 15:34:30.460 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.149s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:34:30 compute-1 podman[213240]: 2026-01-26 15:34:30.887808831 +0000 UTC m=+0.064157803 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:34:30 compute-1 podman[213241]: 2026-01-26 15:34:30.9066248 +0000 UTC m=+0.078454899 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, architecture=x86_64)
Jan 26 15:34:31 compute-1 nova_compute[183403]: 2026-01-26 15:34:31.933 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:32 compute-1 nova_compute[183403]: 2026-01-26 15:34:32.460 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:34:32 compute-1 nova_compute[183403]: 2026-01-26 15:34:32.461 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:34:32 compute-1 nova_compute[183403]: 2026-01-26 15:34:32.461 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:34:32 compute-1 nova_compute[183403]: 2026-01-26 15:34:32.462 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:34:32 compute-1 nova_compute[183403]: 2026-01-26 15:34:32.462 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:34:32 compute-1 nova_compute[183403]: 2026-01-26 15:34:32.462 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:34:33 compute-1 nova_compute[183403]: 2026-01-26 15:34:33.527 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:35 compute-1 nova_compute[183403]: 2026-01-26 15:34:35.633 183407 DEBUG nova.compute.manager [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7vcunsmv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4f73a02a-fb26-4967-89d0-d1f3bba3c8cc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:34:35 compute-1 podman[192725]: time="2026-01-26T15:34:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:34:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:34:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:34:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:34:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Jan 26 15:34:36 compute-1 nova_compute[183403]: 2026-01-26 15:34:36.654 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-4f73a02a-fb26-4967-89d0-d1f3bba3c8cc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:34:36 compute-1 nova_compute[183403]: 2026-01-26 15:34:36.655 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-4f73a02a-fb26-4967-89d0-d1f3bba3c8cc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:34:36 compute-1 nova_compute[183403]: 2026-01-26 15:34:36.655 183407 DEBUG nova.network.neutron [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:34:36 compute-1 nova_compute[183403]: 2026-01-26 15:34:36.933 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:37 compute-1 nova_compute[183403]: 2026-01-26 15:34:37.163 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:37 compute-1 nova_compute[183403]: 2026-01-26 15:34:37.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:34:37 compute-1 nova_compute[183403]: 2026-01-26 15:34:37.583 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:37 compute-1 nova_compute[183403]: 2026-01-26 15:34:37.778 183407 DEBUG nova.network.neutron [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Updating instance_info_cache with network_info: [{"id": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "address": "fa:16:3e:8c:e6:21", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3572491e-cb", "ovs_interfaceid": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:34:38 compute-1 sshd-session[213052]: ssh_dispatch_run_fatal: Connection from 180.76.172.156 port 47490: Connection timed out [preauth]
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.288 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-4f73a02a-fb26-4967-89d0-d1f3bba3c8cc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.309 183407 DEBUG nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7vcunsmv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4f73a02a-fb26-4967-89d0-d1f3bba3c8cc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.310 183407 DEBUG nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Creating instance directory: /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.310 183407 DEBUG nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Creating disk.info with the contents: {'/var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk': 'qcow2', '/var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.310 183407 DEBUG nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.311 183407 DEBUG nova.objects.instance [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.529 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.816 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.823 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.825 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.903 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.904 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.904 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.905 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.908 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.908 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.971 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:34:38 compute-1 nova_compute[183403]: 2026-01-26 15:34:38.972 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.020 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.022 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.023 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.075 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.077 183407 DEBUG nova.virt.disk.api [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.077 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.129 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.131 183407 DEBUG nova.virt.disk.api [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.132 183407 DEBUG nova.objects.instance [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.641 183407 DEBUG nova.objects.base [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<4f73a02a-fb26-4967-89d0-d1f3bba3c8cc> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.641 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.678 183407 DEBUG oslo_concurrency.processutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk.config 497664" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.678 183407 DEBUG nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.680 183407 DEBUG nova.virt.libvirt.vif [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:33:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-548888713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-5488887',id=27,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:33:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f7ddd8ab2ae4841a1f43ae8078bb924',ramdisk_id='',reservation_id='r-328l48ai',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:33:57Z,user_data=None,user_id='8152e350b54f44cabafc751c752d6f92',uuid=4f73a02a-fb26-4967-89d0-d1f3bba3c8cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "address": "fa:16:3e:8c:e6:21", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3572491e-cb", "ovs_interfaceid": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.680 183407 DEBUG nova.network.os_vif_util [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "address": "fa:16:3e:8c:e6:21", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3572491e-cb", "ovs_interfaceid": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.681 183407 DEBUG nova.network.os_vif_util [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:e6:21,bridge_name='br-int',has_traffic_filtering=True,id=3572491e-cb7e-4f70-b828-2f4d9fda6e48,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3572491e-cb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.681 183407 DEBUG os_vif [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:e6:21,bridge_name='br-int',has_traffic_filtering=True,id=3572491e-cb7e-4f70-b828-2f4d9fda6e48,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3572491e-cb') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.682 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.682 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.683 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.683 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.684 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0ebddd95-240d-58ce-89a4-2efb081a39c7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.685 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.687 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.691 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.691 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3572491e-cb, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.691 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3572491e-cb, col_values=(('qos', UUID('926436e1-d2fd-4ea5-9470-f5abc42b1cb0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.692 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3572491e-cb, col_values=(('external_ids', {'iface-id': '3572491e-cb7e-4f70-b828-2f4d9fda6e48', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:e6:21', 'vm-uuid': '4f73a02a-fb26-4967-89d0-d1f3bba3c8cc'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.693 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:39 compute-1 NetworkManager[55716]: <info>  [1769441679.6946] manager: (tap3572491e-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.695 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.703 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.703 183407 INFO os_vif [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:e6:21,bridge_name='br-int',has_traffic_filtering=True,id=3572491e-cb7e-4f70-b828-2f4d9fda6e48,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3572491e-cb')
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.704 183407 DEBUG nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.704 183407 DEBUG nova.compute.manager [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7vcunsmv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4f73a02a-fb26-4967-89d0-d1f3bba3c8cc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:34:39 compute-1 nova_compute[183403]: 2026-01-26 15:34:39.705 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:40 compute-1 nova_compute[183403]: 2026-01-26 15:34:40.252 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:40.527 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:34:40 compute-1 nova_compute[183403]: 2026-01-26 15:34:40.527 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:40 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:40.528 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:34:41 compute-1 nova_compute[183403]: 2026-01-26 15:34:41.649 183407 DEBUG nova.network.neutron [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Port 3572491e-cb7e-4f70-b828-2f4d9fda6e48 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:34:41 compute-1 nova_compute[183403]: 2026-01-26 15:34:41.665 183407 DEBUG nova.compute.manager [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7vcunsmv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4f73a02a-fb26-4967-89d0-d1f3bba3c8cc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:34:41 compute-1 podman[213306]: 2026-01-26 15:34:41.927211714 +0000 UTC m=+0.096805303 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 15:34:41 compute-1 nova_compute[183403]: 2026-01-26 15:34:41.935 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:41 compute-1 podman[213305]: 2026-01-26 15:34:41.966593207 +0000 UTC m=+0.141903252 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120)
Jan 26 15:34:43 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:43.531 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:34:44 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 15:34:44 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 15:34:44 compute-1 kernel: tap3572491e-cb: entered promiscuous mode
Jan 26 15:34:44 compute-1 NetworkManager[55716]: <info>  [1769441684.4735] manager: (tap3572491e-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Jan 26 15:34:44 compute-1 nova_compute[183403]: 2026-01-26 15:34:44.475 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:44 compute-1 ovn_controller[95641]: 2026-01-26T15:34:44Z|00195|binding|INFO|Claiming lport 3572491e-cb7e-4f70-b828-2f4d9fda6e48 for this additional chassis.
Jan 26 15:34:44 compute-1 ovn_controller[95641]: 2026-01-26T15:34:44Z|00196|binding|INFO|3572491e-cb7e-4f70-b828-2f4d9fda6e48: Claiming fa:16:3e:8c:e6:21 10.100.0.9
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.495 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:e6:21 10.100.0.9'], port_security=['fa:16:3e:8c:e6:21 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4f73a02a-fb26-4967-89d0-d1f3bba3c8cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c563489a-e307-4381-b22d-8f22c6dbbfd6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76661f47-b7c7-4131-9a1a-0f8828404115, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=3572491e-cb7e-4f70-b828-2f4d9fda6e48) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.496 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 3572491e-cb7e-4f70-b828-2f4d9fda6e48 in datapath cc98e8b1-8169-4a08-8b22-cd8a87c017a0 unbound from our chassis
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.497 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:34:44 compute-1 systemd-udevd[213381]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.521 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a38bbcc1-9312-4194-86b0-94993275980f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.522 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc98e8b1-81 in ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.525 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc98e8b1-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.525 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1858e629-b156-46cb-8899-f5e03ea83218]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.527 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f44f17eb-d17c-4b72-a1f3-b4b75e40883f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 NetworkManager[55716]: <info>  [1769441684.5334] device (tap3572491e-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:34:44 compute-1 NetworkManager[55716]: <info>  [1769441684.5345] device (tap3572491e-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:34:44 compute-1 systemd-machined[154697]: New machine qemu-18-instance-0000001b.
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.545 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[a679d914-cfa8-41e2-ac76-ef893be6969c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 nova_compute[183403]: 2026-01-26 15:34:44.548 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:44 compute-1 nova_compute[183403]: 2026-01-26 15:34:44.552 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:44 compute-1 ovn_controller[95641]: 2026-01-26T15:34:44Z|00197|binding|INFO|Setting lport 3572491e-cb7e-4f70-b828-2f4d9fda6e48 ovn-installed in OVS
Jan 26 15:34:44 compute-1 nova_compute[183403]: 2026-01-26 15:34:44.556 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:44 compute-1 systemd[1]: Started Virtual Machine qemu-18-instance-0000001b.
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.564 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c52cda-cc65-4018-8c4d-fc9257d6409f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.601 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e4073d-4762-4c0d-84c1-1bf7f435194c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.607 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[85f55c4f-bfc9-4c7b-a5c4-863b03d4fe97]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 NetworkManager[55716]: <info>  [1769441684.6080] manager: (tapcc98e8b1-80): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Jan 26 15:34:44 compute-1 systemd-udevd[213386]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.641 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[b83e4242-26da-4e19-9b8f-7d00da2704cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.644 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[46b30d79-1bbc-48f9-94a3-ca23cc8f2267]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 NetworkManager[55716]: <info>  [1769441684.6760] device (tapcc98e8b1-80): carrier: link connected
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.683 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[4d830f25-3a9f-4486-bee3-5026e52a9105]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 nova_compute[183403]: 2026-01-26 15:34:44.693 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.699 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[796b515d-4ff1-47e7-a48c-4db463eec6bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc98e8b1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:f7:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540728, 'reachable_time': 21508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213416, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.715 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[87966794-41de-4baa-814a-c7c887678d2a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:f73a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540728, 'tstamp': 540728}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213417, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.732 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[605c4617-b86a-40e3-a53d-16fd129e3a90]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc98e8b1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:f7:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540728, 'reachable_time': 21508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213418, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.769 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3eff33-618a-486f-a2ef-cc3533256677]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.866 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[57a2fc81-7e37-4959-854e-6137f9c20e5f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.868 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc98e8b1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.868 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.869 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc98e8b1-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:34:44 compute-1 NetworkManager[55716]: <info>  [1769441684.8727] manager: (tapcc98e8b1-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 26 15:34:44 compute-1 nova_compute[183403]: 2026-01-26 15:34:44.872 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:44 compute-1 kernel: tapcc98e8b1-80: entered promiscuous mode
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.878 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc98e8b1-80, col_values=(('external_ids', {'iface-id': '219edf33-3765-4d7c-87b5-4ab0ed1d6a8a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:34:44 compute-1 ovn_controller[95641]: 2026-01-26T15:34:44Z|00198|binding|INFO|Releasing lport 219edf33-3765-4d7c-87b5-4ab0ed1d6a8a from this chassis (sb_readonly=0)
Jan 26 15:34:44 compute-1 nova_compute[183403]: 2026-01-26 15:34:44.880 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.884 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[dbbfbcd5-a17a-497c-8464-7bd61169c162]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.885 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.885 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.885 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for cc98e8b1-8169-4a08-8b22-cd8a87c017a0 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.885 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.886 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6b94b14c-6b47-47a0-b6da-381dc9afc035]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.887 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.888 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[63c5b428-939c-42c3-a250-800d42b2470d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.888 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:34:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:34:44.889 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'env', 'PROCESS_TAG=haproxy-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:34:44 compute-1 nova_compute[183403]: 2026-01-26 15:34:44.892 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:45 compute-1 podman[213457]: 2026-01-26 15:34:45.306938607 +0000 UTC m=+0.037133424 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:34:45 compute-1 podman[213457]: 2026-01-26 15:34:45.602476623 +0000 UTC m=+0.332671380 container create 9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:34:45 compute-1 systemd[1]: Started libpod-conmon-9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39.scope.
Jan 26 15:34:45 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:34:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4db4710a879ec7acf1d6973db45a34b9b7305f0972f8ee1099ab3b086690a9aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:34:45 compute-1 podman[213457]: 2026-01-26 15:34:45.795283167 +0000 UTC m=+0.525478414 container init 9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest)
Jan 26 15:34:45 compute-1 podman[213457]: 2026-01-26 15:34:45.806940542 +0000 UTC m=+0.537135339 container start 9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 15:34:45 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[213470]: [NOTICE]   (213474) : New worker (213476) forked
Jan 26 15:34:45 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[213470]: [NOTICE]   (213474) : Loading success.
Jan 26 15:34:46 compute-1 nova_compute[183403]: 2026-01-26 15:34:46.981 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:48 compute-1 ovn_controller[95641]: 2026-01-26T15:34:48Z|00199|binding|INFO|Claiming lport 3572491e-cb7e-4f70-b828-2f4d9fda6e48 for this chassis.
Jan 26 15:34:48 compute-1 ovn_controller[95641]: 2026-01-26T15:34:48Z|00200|binding|INFO|3572491e-cb7e-4f70-b828-2f4d9fda6e48: Claiming fa:16:3e:8c:e6:21 10.100.0.9
Jan 26 15:34:48 compute-1 ovn_controller[95641]: 2026-01-26T15:34:48Z|00201|binding|INFO|Setting lport 3572491e-cb7e-4f70-b828-2f4d9fda6e48 up in Southbound
Jan 26 15:34:49 compute-1 openstack_network_exporter[195610]: ERROR   15:34:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:34:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:34:49 compute-1 openstack_network_exporter[195610]: ERROR   15:34:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:34:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:34:49 compute-1 nova_compute[183403]: 2026-01-26 15:34:49.695 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:49 compute-1 nova_compute[183403]: 2026-01-26 15:34:49.813 183407 INFO nova.compute.manager [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Post operation of migration started
Jan 26 15:34:49 compute-1 nova_compute[183403]: 2026-01-26 15:34:49.813 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:49 compute-1 nova_compute[183403]: 2026-01-26 15:34:49.924 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:49 compute-1 nova_compute[183403]: 2026-01-26 15:34:49.924 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:50 compute-1 nova_compute[183403]: 2026-01-26 15:34:50.253 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-4f73a02a-fb26-4967-89d0-d1f3bba3c8cc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:34:50 compute-1 nova_compute[183403]: 2026-01-26 15:34:50.253 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-4f73a02a-fb26-4967-89d0-d1f3bba3c8cc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:34:50 compute-1 nova_compute[183403]: 2026-01-26 15:34:50.253 183407 DEBUG nova.network.neutron [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:34:50 compute-1 nova_compute[183403]: 2026-01-26 15:34:50.759 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:51 compute-1 nova_compute[183403]: 2026-01-26 15:34:51.433 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:51 compute-1 nova_compute[183403]: 2026-01-26 15:34:51.631 183407 DEBUG nova.network.neutron [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Updating instance_info_cache with network_info: [{"id": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "address": "fa:16:3e:8c:e6:21", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3572491e-cb", "ovs_interfaceid": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:34:51 compute-1 nova_compute[183403]: 2026-01-26 15:34:51.982 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:52 compute-1 nova_compute[183403]: 2026-01-26 15:34:52.141 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-4f73a02a-fb26-4967-89d0-d1f3bba3c8cc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:34:52 compute-1 nova_compute[183403]: 2026-01-26 15:34:52.682 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:34:52 compute-1 nova_compute[183403]: 2026-01-26 15:34:52.682 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:34:52 compute-1 nova_compute[183403]: 2026-01-26 15:34:52.683 183407 DEBUG oslo_concurrency.lockutils [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:34:52 compute-1 nova_compute[183403]: 2026-01-26 15:34:52.688 183407 INFO nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:34:52 compute-1 virtqemud[183290]: Domain id=18 name='instance-0000001b' uuid=4f73a02a-fb26-4967-89d0-d1f3bba3c8cc is tainted: custom-monitor
Jan 26 15:34:53 compute-1 nova_compute[183403]: 2026-01-26 15:34:53.698 183407 INFO nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:34:54 compute-1 nova_compute[183403]: 2026-01-26 15:34:54.698 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:54 compute-1 nova_compute[183403]: 2026-01-26 15:34:54.706 183407 INFO nova.virt.libvirt.driver [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:34:54 compute-1 nova_compute[183403]: 2026-01-26 15:34:54.709 183407 DEBUG nova.compute.manager [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:34:55 compute-1 nova_compute[183403]: 2026-01-26 15:34:55.223 183407 DEBUG nova.objects.instance [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:34:56 compute-1 nova_compute[183403]: 2026-01-26 15:34:56.247 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:56 compute-1 nova_compute[183403]: 2026-01-26 15:34:56.750 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:56 compute-1 nova_compute[183403]: 2026-01-26 15:34:56.750 183407 WARNING neutronclient.v2_0.client [None req-f55fc739-896c-4fd8-89ce-c3768104d1d5 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:34:56 compute-1 nova_compute[183403]: 2026-01-26 15:34:56.985 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:34:59 compute-1 nova_compute[183403]: 2026-01-26 15:34:59.725 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:01 compute-1 podman[213495]: 2026-01-26 15:35:01.883656695 +0000 UTC m=+0.060750821 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:35:01 compute-1 podman[213496]: 2026-01-26 15:35:01.887957911 +0000 UTC m=+0.065245262 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350)
Jan 26 15:35:01 compute-1 nova_compute[183403]: 2026-01-26 15:35:01.986 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:04 compute-1 nova_compute[183403]: 2026-01-26 15:35:04.729 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:05 compute-1 nova_compute[183403]: 2026-01-26 15:35:05.084 183407 DEBUG nova.compute.manager [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcyraiacu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:35:05 compute-1 podman[192725]: time="2026-01-26T15:35:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:35:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:35:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:35:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:35:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2658 "" "Go-http-client/1.1"
Jan 26 15:35:06 compute-1 nova_compute[183403]: 2026-01-26 15:35:06.302 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:35:06 compute-1 nova_compute[183403]: 2026-01-26 15:35:06.303 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:35:06 compute-1 nova_compute[183403]: 2026-01-26 15:35:06.303 183407 DEBUG nova.network.neutron [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:35:06 compute-1 nova_compute[183403]: 2026-01-26 15:35:06.812 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:06 compute-1 nova_compute[183403]: 2026-01-26 15:35:06.987 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:07 compute-1 nova_compute[183403]: 2026-01-26 15:35:07.481 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:07 compute-1 nova_compute[183403]: 2026-01-26 15:35:07.653 183407 DEBUG nova.network.neutron [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Updating instance_info_cache with network_info: [{"id": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "address": "fa:16:3e:47:95:ef", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabfe8595-6f", "ovs_interfaceid": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.161 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.177 183407 DEBUG nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcyraiacu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.178 183407 DEBUG nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Creating instance directory: /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.178 183407 DEBUG nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Creating disk.info with the contents: {'/var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk': 'qcow2', '/var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.179 183407 DEBUG nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.179 183407 DEBUG nova.objects.instance [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.686 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.691 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.693 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.769 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.770 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.771 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.771 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.776 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.776 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.854 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:08 compute-1 nova_compute[183403]: 2026-01-26 15:35:08.855 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.226 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk 1073741824" returned: 0 in 0.371s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.227 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.457s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.228 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.313 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.314 183407 DEBUG nova.virt.disk.api [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.315 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.374 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.375 183407 DEBUG nova.virt.disk.api [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.375 183407 DEBUG nova.objects.instance [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.731 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.884 183407 DEBUG nova.objects.base [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.885 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.911 183407 DEBUG oslo_concurrency.processutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk.config 497664" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.912 183407 DEBUG nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.913 183407 DEBUG nova.virt.libvirt.vif [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:33:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-624008049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-6240080',id=26,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:33:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f7ddd8ab2ae4841a1f43ae8078bb924',ramdisk_id='',reservation_id='r-s0fi6kdn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:33:38Z,user_data=None,user_id='8152e350b54f44cabafc751c752d6f92',uuid=ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "address": "fa:16:3e:47:95:ef", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapabfe8595-6f", "ovs_interfaceid": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.914 183407 DEBUG nova.network.os_vif_util [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "address": "fa:16:3e:47:95:ef", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapabfe8595-6f", "ovs_interfaceid": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.915 183407 DEBUG nova.network.os_vif_util [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:95:ef,bridge_name='br-int',has_traffic_filtering=True,id=abfe8595-6f38-41fe-a0cb-eeaa34a05633,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabfe8595-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.915 183407 DEBUG os_vif [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:95:ef,bridge_name='br-int',has_traffic_filtering=True,id=abfe8595-6f38-41fe-a0cb-eeaa34a05633,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabfe8595-6f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.916 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.917 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.917 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.918 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.918 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0d1424ce-0379-5765-a9bf-a9e7459a591a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.920 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.921 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.926 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.927 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapabfe8595-6f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.927 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapabfe8595-6f, col_values=(('qos', UUID('a13d8cfe-76fd-4254-a716-57bb36bfc5ab')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.928 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapabfe8595-6f, col_values=(('external_ids', {'iface-id': 'abfe8595-6f38-41fe-a0cb-eeaa34a05633', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:95:ef', 'vm-uuid': 'ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:09 compute-1 NetworkManager[55716]: <info>  [1769441709.9298] manager: (tapabfe8595-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.929 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.932 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.937 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.938 183407 INFO os_vif [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:95:ef,bridge_name='br-int',has_traffic_filtering=True,id=abfe8595-6f38-41fe-a0cb-eeaa34a05633,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabfe8595-6f')
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.938 183407 DEBUG nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.939 183407 DEBUG nova.compute.manager [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcyraiacu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:35:09 compute-1 nova_compute[183403]: 2026-01-26 15:35:09.940 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:10 compute-1 nova_compute[183403]: 2026-01-26 15:35:10.284 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:10 compute-1 nova_compute[183403]: 2026-01-26 15:35:10.938 183407 DEBUG nova.network.neutron [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Port abfe8595-6f38-41fe-a0cb-eeaa34a05633 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:35:10 compute-1 nova_compute[183403]: 2026-01-26 15:35:10.953 183407 DEBUG nova.compute.manager [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcyraiacu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:35:11 compute-1 nova_compute[183403]: 2026-01-26 15:35:11.990 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:12 compute-1 podman[213561]: 2026-01-26 15:35:12.929959756 +0000 UTC m=+0.107529613 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 26 15:35:12 compute-1 podman[213560]: 2026-01-26 15:35:12.940043268 +0000 UTC m=+0.112479456 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 15:35:14 compute-1 NetworkManager[55716]: <info>  [1769441714.4486] manager: (tapabfe8595-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Jan 26 15:35:14 compute-1 kernel: tapabfe8595-6f: entered promiscuous mode
Jan 26 15:35:14 compute-1 nova_compute[183403]: 2026-01-26 15:35:14.453 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:14 compute-1 ovn_controller[95641]: 2026-01-26T15:35:14Z|00202|binding|INFO|Claiming lport abfe8595-6f38-41fe-a0cb-eeaa34a05633 for this additional chassis.
Jan 26 15:35:14 compute-1 ovn_controller[95641]: 2026-01-26T15:35:14Z|00203|binding|INFO|abfe8595-6f38-41fe-a0cb-eeaa34a05633: Claiming fa:16:3e:47:95:ef 10.100.0.7
Jan 26 15:35:14 compute-1 nova_compute[183403]: 2026-01-26 15:35:14.461 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.477 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:95:ef 10.100.0.7'], port_security=['fa:16:3e:47:95:ef 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c563489a-e307-4381-b22d-8f22c6dbbfd6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76661f47-b7c7-4131-9a1a-0f8828404115, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=abfe8595-6f38-41fe-a0cb-eeaa34a05633) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.479 104930 INFO neutron.agent.ovn.metadata.agent [-] Port abfe8595-6f38-41fe-a0cb-eeaa34a05633 in datapath cc98e8b1-8169-4a08-8b22-cd8a87c017a0 unbound from our chassis
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.480 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:35:14 compute-1 systemd-udevd[213619]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:35:14 compute-1 nova_compute[183403]: 2026-01-26 15:35:14.484 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:14 compute-1 ovn_controller[95641]: 2026-01-26T15:35:14Z|00204|binding|INFO|Setting lport abfe8595-6f38-41fe-a0cb-eeaa34a05633 ovn-installed in OVS
Jan 26 15:35:14 compute-1 nova_compute[183403]: 2026-01-26 15:35:14.486 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:14 compute-1 NetworkManager[55716]: <info>  [1769441714.4988] device (tapabfe8595-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:35:14 compute-1 NetworkManager[55716]: <info>  [1769441714.4997] device (tapabfe8595-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.500 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[365ffdf8-1848-449c-927a-5f89cddb0e45]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:14 compute-1 systemd-machined[154697]: New machine qemu-19-instance-0000001a.
Jan 26 15:35:14 compute-1 systemd[1]: Started Virtual Machine qemu-19-instance-0000001a.
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.531 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[58340df2-46e1-48ff-9834-2b13e82ba484]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.534 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[0dbf90c9-8483-446e-9113-b000229935e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.571 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[869b9ced-e38b-4bb4-b223-10f5300aead1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.591 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5fb668-f24b-4a96-82cd-3ddfd93a3332]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc98e8b1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:f7:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540728, 'reachable_time': 21508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213635, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.617 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[49db0916-ee9d-4c1d-9b90-16822b6c9a5f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapcc98e8b1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540741, 'tstamp': 540741}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213637, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapcc98e8b1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540746, 'tstamp': 540746}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213637, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.619 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc98e8b1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:14 compute-1 nova_compute[183403]: 2026-01-26 15:35:14.621 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.622 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc98e8b1-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.622 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.623 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc98e8b1-80, col_values=(('external_ids', {'iface-id': '219edf33-3765-4d7c-87b5-4ab0ed1d6a8a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.623 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:35:14 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:14.624 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6bbe47a4-54e0-43fd-9685-b4f7fdebed9f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-cc98e8b1-8169-4a08-8b22-cd8a87c017a0\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID cc98e8b1-8169-4a08-8b22-cd8a87c017a0\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:14 compute-1 nova_compute[183403]: 2026-01-26 15:35:14.929 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:17 compute-1 nova_compute[183403]: 2026-01-26 15:35:17.040 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:17 compute-1 ovn_controller[95641]: 2026-01-26T15:35:17Z|00205|binding|INFO|Claiming lport abfe8595-6f38-41fe-a0cb-eeaa34a05633 for this chassis.
Jan 26 15:35:17 compute-1 ovn_controller[95641]: 2026-01-26T15:35:17Z|00206|binding|INFO|abfe8595-6f38-41fe-a0cb-eeaa34a05633: Claiming fa:16:3e:47:95:ef 10.100.0.7
Jan 26 15:35:17 compute-1 ovn_controller[95641]: 2026-01-26T15:35:17Z|00207|binding|INFO|Setting lport abfe8595-6f38-41fe-a0cb-eeaa34a05633 up in Southbound
Jan 26 15:35:18 compute-1 nova_compute[183403]: 2026-01-26 15:35:18.984 183407 INFO nova.compute.manager [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Post operation of migration started
Jan 26 15:35:18 compute-1 nova_compute[183403]: 2026-01-26 15:35:18.985 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:19 compute-1 nova_compute[183403]: 2026-01-26 15:35:19.277 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:19 compute-1 nova_compute[183403]: 2026-01-26 15:35:19.278 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:19 compute-1 nova_compute[183403]: 2026-01-26 15:35:19.344 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:35:19 compute-1 nova_compute[183403]: 2026-01-26 15:35:19.345 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:35:19 compute-1 nova_compute[183403]: 2026-01-26 15:35:19.345 183407 DEBUG nova.network.neutron [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:35:19 compute-1 openstack_network_exporter[195610]: ERROR   15:35:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:35:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:35:19 compute-1 openstack_network_exporter[195610]: ERROR   15:35:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:35:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:35:19 compute-1 nova_compute[183403]: 2026-01-26 15:35:19.852 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:19 compute-1 nova_compute[183403]: 2026-01-26 15:35:19.931 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:20 compute-1 nova_compute[183403]: 2026-01-26 15:35:20.341 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:20 compute-1 nova_compute[183403]: 2026-01-26 15:35:20.519 183407 DEBUG nova.network.neutron [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Updating instance_info_cache with network_info: [{"id": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "address": "fa:16:3e:47:95:ef", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabfe8595-6f", "ovs_interfaceid": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:35:20 compute-1 nova_compute[183403]: 2026-01-26 15:35:20.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:35:21 compute-1 nova_compute[183403]: 2026-01-26 15:35:21.029 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:35:21 compute-1 nova_compute[183403]: 2026-01-26 15:35:21.558 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:21 compute-1 nova_compute[183403]: 2026-01-26 15:35:21.559 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:21 compute-1 nova_compute[183403]: 2026-01-26 15:35:21.559 183407 DEBUG oslo_concurrency.lockutils [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:21 compute-1 nova_compute[183403]: 2026-01-26 15:35:21.565 183407 INFO nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:35:21 compute-1 virtqemud[183290]: Domain id=19 name='instance-0000001a' uuid=ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee is tainted: custom-monitor
Jan 26 15:35:22 compute-1 nova_compute[183403]: 2026-01-26 15:35:22.042 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:22 compute-1 nova_compute[183403]: 2026-01-26 15:35:22.578 183407 INFO nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:35:23 compute-1 nova_compute[183403]: 2026-01-26 15:35:23.586 183407 INFO nova.virt.libvirt.driver [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:35:23 compute-1 nova_compute[183403]: 2026-01-26 15:35:23.592 183407 DEBUG nova.compute.manager [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:35:24 compute-1 nova_compute[183403]: 2026-01-26 15:35:24.102 183407 DEBUG nova.objects.instance [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:35:24 compute-1 nova_compute[183403]: 2026-01-26 15:35:24.934 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:25 compute-1 nova_compute[183403]: 2026-01-26 15:35:25.138 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:25 compute-1 nova_compute[183403]: 2026-01-26 15:35:25.279 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:25 compute-1 nova_compute[183403]: 2026-01-26 15:35:25.280 183407 WARNING neutronclient.v2_0.client [None req-a2fd9137-4638-4104-83c1-45c90f1e81fe a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:25 compute-1 nova_compute[183403]: 2026-01-26 15:35:25.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:35:26 compute-1 nova_compute[183403]: 2026-01-26 15:35:26.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:35:27 compute-1 nova_compute[183403]: 2026-01-26 15:35:27.045 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:27 compute-1 nova_compute[183403]: 2026-01-26 15:35:27.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:27 compute-1 nova_compute[183403]: 2026-01-26 15:35:27.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:27 compute-1 nova_compute[183403]: 2026-01-26 15:35:27.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:27 compute-1 nova_compute[183403]: 2026-01-26 15:35:27.094 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.154 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.244 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.245 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.310 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.316 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.397 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.398 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.494 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.697 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.699 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.740 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.740 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5536MB free_disk=73.08686828613281GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.741 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:28 compute-1 nova_compute[183403]: 2026-01-26 15:35:28.741 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:29.090 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:29.091 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:29.092 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:29 compute-1 nova_compute[183403]: 2026-01-26 15:35:29.669 183407 DEBUG oslo_concurrency.lockutils [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:29 compute-1 nova_compute[183403]: 2026-01-26 15:35:29.670 183407 DEBUG oslo_concurrency.lockutils [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:29 compute-1 nova_compute[183403]: 2026-01-26 15:35:29.670 183407 DEBUG oslo_concurrency.lockutils [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:29 compute-1 nova_compute[183403]: 2026-01-26 15:35:29.671 183407 DEBUG oslo_concurrency.lockutils [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:29 compute-1 nova_compute[183403]: 2026-01-26 15:35:29.672 183407 DEBUG oslo_concurrency.lockutils [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:29 compute-1 nova_compute[183403]: 2026-01-26 15:35:29.687 183407 INFO nova.compute.manager [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Terminating instance
Jan 26 15:35:29 compute-1 nova_compute[183403]: 2026-01-26 15:35:29.765 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Applying migration context for instance ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee as it has an incoming, in-progress migration f6b2ddea-e1cd-44ca-97b5-bb91517c7f44. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Jan 26 15:35:29 compute-1 nova_compute[183403]: 2026-01-26 15:35:29.766 183407 DEBUG nova.objects.instance [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:35:29 compute-1 nova_compute[183403]: 2026-01-26 15:35:29.937 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.207 183407 DEBUG nova.compute.manager [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:35:30 compute-1 kernel: tap3572491e-cb (unregistering): left promiscuous mode
Jan 26 15:35:30 compute-1 NetworkManager[55716]: <info>  [1769441730.5536] device (tap3572491e-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:35:30 compute-1 ovn_controller[95641]: 2026-01-26T15:35:30Z|00208|binding|INFO|Releasing lport 3572491e-cb7e-4f70-b828-2f4d9fda6e48 from this chassis (sb_readonly=0)
Jan 26 15:35:30 compute-1 ovn_controller[95641]: 2026-01-26T15:35:30Z|00209|binding|INFO|Setting lport 3572491e-cb7e-4f70-b828-2f4d9fda6e48 down in Southbound
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.561 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:30 compute-1 ovn_controller[95641]: 2026-01-26T15:35:30Z|00210|binding|INFO|Removing iface tap3572491e-cb ovn-installed in OVS
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.566 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.569 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:e6:21 10.100.0.9'], port_security=['fa:16:3e:8c:e6:21 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4f73a02a-fb26-4967-89d0-d1f3bba3c8cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '16', 'neutron:security_group_ids': 'c563489a-e307-4381-b22d-8f22c6dbbfd6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76661f47-b7c7-4131-9a1a-0f8828404115, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=3572491e-cb7e-4f70-b828-2f4d9fda6e48) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.571 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 3572491e-cb7e-4f70-b828-2f4d9fda6e48 in datapath cc98e8b1-8169-4a08-8b22-cd8a87c017a0 unbound from our chassis
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.572 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.590 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.596 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[ae7778ba-52a0-48a9-9eed-32fc567c5dc5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.637 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[4158d3a3-17dd-4ce0-8d4c-dd02553408c3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:30 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.642 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[9f622978-8d42-41e2-b079-02f16e31d832]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:30 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001b.scope: Consumed 4.567s CPU time.
Jan 26 15:35:30 compute-1 systemd-machined[154697]: Machine qemu-18-instance-0000001b terminated.
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.673 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[83b66df4-c436-4a9a-b838-150d2e56be5b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.691 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[df4fdcea-386c-4ec4-9a5d-ddc4c9ee1eb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc98e8b1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:f7:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540728, 'reachable_time': 21508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213679, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.700 183407 DEBUG nova.compute.manager [req-bd8c97f9-5217-4ef5-b32b-b1f5ee99bbc6 req-f86b5cf0-c0db-45af-9210-555e21cbfaab 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Received event network-vif-unplugged-3572491e-cb7e-4f70-b828-2f4d9fda6e48 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.701 183407 DEBUG oslo_concurrency.lockutils [req-bd8c97f9-5217-4ef5-b32b-b1f5ee99bbc6 req-f86b5cf0-c0db-45af-9210-555e21cbfaab 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.701 183407 DEBUG oslo_concurrency.lockutils [req-bd8c97f9-5217-4ef5-b32b-b1f5ee99bbc6 req-f86b5cf0-c0db-45af-9210-555e21cbfaab 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.701 183407 DEBUG oslo_concurrency.lockutils [req-bd8c97f9-5217-4ef5-b32b-b1f5ee99bbc6 req-f86b5cf0-c0db-45af-9210-555e21cbfaab 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.702 183407 DEBUG nova.compute.manager [req-bd8c97f9-5217-4ef5-b32b-b1f5ee99bbc6 req-f86b5cf0-c0db-45af-9210-555e21cbfaab 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] No waiting events found dispatching network-vif-unplugged-3572491e-cb7e-4f70-b828-2f4d9fda6e48 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.702 183407 DEBUG nova.compute.manager [req-bd8c97f9-5217-4ef5-b32b-b1f5ee99bbc6 req-f86b5cf0-c0db-45af-9210-555e21cbfaab 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Received event network-vif-unplugged-3572491e-cb7e-4f70-b828-2f4d9fda6e48 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.710 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1800f99d-6d43-4c2c-9fdd-0b9a6ba4ea7e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapcc98e8b1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540741, 'tstamp': 540741}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213680, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapcc98e8b1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540746, 'tstamp': 540746}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213680, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.712 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc98e8b1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.790 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.824 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.826 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.826 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.826 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.827 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:35:28 up  1:30,  0 user,  load average: 0.22, 0.17, 0.21\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_8f7ddd8ab2ae4841a1f43ae8078bb924': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.831 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.831 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc98e8b1-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.832 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.832 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc98e8b1-80, col_values=(('external_ids', {'iface-id': '219edf33-3765-4d7c-87b5-4ab0ed1d6a8a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.832 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:35:30 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:30.834 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e32b50b1-d8be-4861-b561-845c189688f0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-cc98e8b1-8169-4a08-8b22-cd8a87c017a0\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID cc98e8b1-8169-4a08-8b22-cd8a87c017a0\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.883 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.889 183407 INFO nova.virt.libvirt.driver [-] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Instance destroyed successfully.
Jan 26 15:35:30 compute-1 nova_compute[183403]: 2026-01-26 15:35:30.890 183407 DEBUG nova.objects.instance [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lazy-loading 'resources' on Instance uuid 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.393 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.401 183407 DEBUG nova.virt.libvirt.vif [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:33:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-548888713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-5488887',id=27,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:33:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f7ddd8ab2ae4841a1f43ae8078bb924',ramdisk_id='',reservation_id='r-328l48ai',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:34:55Z,user_data=None,user_id='8152e350b54f44cabafc751c752d6f92',uuid=4f73a02a-fb26-4967-89d0-d1f3bba3c8cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "address": "fa:16:3e:8c:e6:21", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3572491e-cb", "ovs_interfaceid": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.402 183407 DEBUG nova.network.os_vif_util [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Converting VIF {"id": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "address": "fa:16:3e:8c:e6:21", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3572491e-cb", "ovs_interfaceid": "3572491e-cb7e-4f70-b828-2f4d9fda6e48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.404 183407 DEBUG nova.network.os_vif_util [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:e6:21,bridge_name='br-int',has_traffic_filtering=True,id=3572491e-cb7e-4f70-b828-2f4d9fda6e48,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3572491e-cb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.405 183407 DEBUG os_vif [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:e6:21,bridge_name='br-int',has_traffic_filtering=True,id=3572491e-cb7e-4f70-b828-2f4d9fda6e48,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3572491e-cb') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.409 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.410 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3572491e-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.413 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.418 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.418 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.418 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=926436e1-d2fd-4ea5-9470-f5abc42b1cb0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.419 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.420 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.422 183407 INFO os_vif [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:e6:21,bridge_name='br-int',has_traffic_filtering=True,id=3572491e-cb7e-4f70-b828-2f4d9fda6e48,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3572491e-cb')
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.422 183407 INFO nova.virt.libvirt.driver [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Deleting instance files /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc_del
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.423 183407 INFO nova.virt.libvirt.driver [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Deletion of /var/lib/nova/instances/4f73a02a-fb26-4967-89d0-d1f3bba3c8cc_del complete
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.913 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.913 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.172s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.933 183407 INFO nova.compute.manager [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Took 1.73 seconds to destroy the instance on the hypervisor.
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.933 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.933 183407 DEBUG nova.compute.manager [-] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.934 183407 DEBUG nova.network.neutron [-] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:35:31 compute-1 nova_compute[183403]: 2026-01-26 15:35:31.934 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.048 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.050 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.453 183407 DEBUG nova.compute.manager [req-400de7bd-0cb8-47ff-9eed-c5773a24535f req-938c4e31-395d-4d98-a76c-91f5391f23e1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Received event network-vif-deleted-3572491e-cb7e-4f70-b828-2f4d9fda6e48 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.454 183407 INFO nova.compute.manager [req-400de7bd-0cb8-47ff-9eed-c5773a24535f req-938c4e31-395d-4d98-a76c-91f5391f23e1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Neutron deleted interface 3572491e-cb7e-4f70-b828-2f4d9fda6e48; detaching it from the instance and deleting it from the info cache
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.454 183407 DEBUG nova.network.neutron [req-400de7bd-0cb8-47ff-9eed-c5773a24535f req-938c4e31-395d-4d98-a76c-91f5391f23e1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.791 183407 DEBUG nova.compute.manager [req-6d190c7b-b89b-4914-9701-bf1f285313ea req-b11211e7-0d60-40ca-866c-b67b6ec48291 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Received event network-vif-unplugged-3572491e-cb7e-4f70-b828-2f4d9fda6e48 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.791 183407 DEBUG oslo_concurrency.lockutils [req-6d190c7b-b89b-4914-9701-bf1f285313ea req-b11211e7-0d60-40ca-866c-b67b6ec48291 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.791 183407 DEBUG oslo_concurrency.lockutils [req-6d190c7b-b89b-4914-9701-bf1f285313ea req-b11211e7-0d60-40ca-866c-b67b6ec48291 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.791 183407 DEBUG oslo_concurrency.lockutils [req-6d190c7b-b89b-4914-9701-bf1f285313ea req-b11211e7-0d60-40ca-866c-b67b6ec48291 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.791 183407 DEBUG nova.compute.manager [req-6d190c7b-b89b-4914-9701-bf1f285313ea req-b11211e7-0d60-40ca-866c-b67b6ec48291 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] No waiting events found dispatching network-vif-unplugged-3572491e-cb7e-4f70-b828-2f4d9fda6e48 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.792 183407 DEBUG nova.compute.manager [req-6d190c7b-b89b-4914-9701-bf1f285313ea req-b11211e7-0d60-40ca-866c-b67b6ec48291 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Received event network-vif-unplugged-3572491e-cb7e-4f70-b828-2f4d9fda6e48 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.836 183407 DEBUG nova.network.neutron [-] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:35:32 compute-1 podman[213700]: 2026-01-26 15:35:32.911156617 +0000 UTC m=+0.079297432 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=openstack_network_exporter)
Jan 26 15:35:32 compute-1 podman[213699]: 2026-01-26 15:35:32.917530209 +0000 UTC m=+0.085555191 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:35:32 compute-1 nova_compute[183403]: 2026-01-26 15:35:32.962 183407 DEBUG nova.compute.manager [req-400de7bd-0cb8-47ff-9eed-c5773a24535f req-938c4e31-395d-4d98-a76c-91f5391f23e1 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Detach interface failed, port_id=3572491e-cb7e-4f70-b828-2f4d9fda6e48, reason: Instance 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:35:33 compute-1 nova_compute[183403]: 2026-01-26 15:35:33.342 183407 INFO nova.compute.manager [-] [instance: 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc] Took 1.41 seconds to deallocate network for instance.
Jan 26 15:35:33 compute-1 nova_compute[183403]: 2026-01-26 15:35:33.865 183407 DEBUG oslo_concurrency.lockutils [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:33 compute-1 nova_compute[183403]: 2026-01-26 15:35:33.865 183407 DEBUG oslo_concurrency.lockutils [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:33 compute-1 nova_compute[183403]: 2026-01-26 15:35:33.928 183407 DEBUG nova.compute.provider_tree [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:35:34 compute-1 nova_compute[183403]: 2026-01-26 15:35:34.914 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:35:34 compute-1 nova_compute[183403]: 2026-01-26 15:35:34.915 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:35:34 compute-1 nova_compute[183403]: 2026-01-26 15:35:34.915 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:35:34 compute-1 nova_compute[183403]: 2026-01-26 15:35:34.915 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:35:34 compute-1 nova_compute[183403]: 2026-01-26 15:35:34.916 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:35:34 compute-1 nova_compute[183403]: 2026-01-26 15:35:34.916 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:35:34 compute-1 nova_compute[183403]: 2026-01-26 15:35:34.922 183407 DEBUG nova.scheduler.client.report [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:35:35 compute-1 nova_compute[183403]: 2026-01-26 15:35:35.468 183407 DEBUG oslo_concurrency.lockutils [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.603s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:35 compute-1 nova_compute[183403]: 2026-01-26 15:35:35.491 183407 INFO nova.scheduler.client.report [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Deleted allocations for instance 4f73a02a-fb26-4967-89d0-d1f3bba3c8cc
Jan 26 15:35:35 compute-1 podman[192725]: time="2026-01-26T15:35:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:35:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:35:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:35:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:35:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2658 "" "Go-http-client/1.1"
Jan 26 15:35:36 compute-1 nova_compute[183403]: 2026-01-26 15:35:36.419 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:36 compute-1 nova_compute[183403]: 2026-01-26 15:35:36.537 183407 DEBUG oslo_concurrency.lockutils [None req-19959446-0f4c-4cfa-a957-c89e8f8aa20a 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "4f73a02a-fb26-4967-89d0-d1f3bba3c8cc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.867s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.051 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.213 183407 DEBUG oslo_concurrency.lockutils [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.214 183407 DEBUG oslo_concurrency.lockutils [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.215 183407 DEBUG oslo_concurrency.lockutils [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.215 183407 DEBUG oslo_concurrency.lockutils [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.215 183407 DEBUG oslo_concurrency.lockutils [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.230 183407 INFO nova.compute.manager [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Terminating instance
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.749 183407 DEBUG nova.compute.manager [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:35:37 compute-1 kernel: tapabfe8595-6f (unregistering): left promiscuous mode
Jan 26 15:35:37 compute-1 NetworkManager[55716]: <info>  [1769441737.7765] device (tapabfe8595-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:35:37 compute-1 ovn_controller[95641]: 2026-01-26T15:35:37Z|00211|binding|INFO|Releasing lport abfe8595-6f38-41fe-a0cb-eeaa34a05633 from this chassis (sb_readonly=0)
Jan 26 15:35:37 compute-1 ovn_controller[95641]: 2026-01-26T15:35:37Z|00212|binding|INFO|Setting lport abfe8595-6f38-41fe-a0cb-eeaa34a05633 down in Southbound
Jan 26 15:35:37 compute-1 ovn_controller[95641]: 2026-01-26T15:35:37Z|00213|binding|INFO|Removing iface tapabfe8595-6f ovn-installed in OVS
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.784 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.786 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:37.791 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:95:ef 10.100.0.7'], port_security=['fa:16:3e:47:95:ef 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '16', 'neutron:security_group_ids': 'c563489a-e307-4381-b22d-8f22c6dbbfd6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76661f47-b7c7-4131-9a1a-0f8828404115, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=abfe8595-6f38-41fe-a0cb-eeaa34a05633) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:35:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:37.792 104930 INFO neutron.agent.ovn.metadata.agent [-] Port abfe8595-6f38-41fe-a0cb-eeaa34a05633 in datapath cc98e8b1-8169-4a08-8b22-cd8a87c017a0 unbound from our chassis
Jan 26 15:35:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:37.793 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc98e8b1-8169-4a08-8b22-cd8a87c017a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:35:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:37.794 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2f931b72-e390-4832-b3e3-e5c1ef08789b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:37.794 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 namespace which is not needed anymore
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.799 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:37 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 26 15:35:37 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001a.scope: Consumed 2.796s CPU time.
Jan 26 15:35:37 compute-1 systemd-machined[154697]: Machine qemu-19-instance-0000001a terminated.
Jan 26 15:35:37 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[213470]: [NOTICE]   (213474) : haproxy version is 3.0.5-8e879a5
Jan 26 15:35:37 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[213470]: [NOTICE]   (213474) : path to executable is /usr/sbin/haproxy
Jan 26 15:35:37 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[213470]: [WARNING]  (213474) : Exiting Master process...
Jan 26 15:35:37 compute-1 podman[213769]: 2026-01-26 15:35:37.896693045 +0000 UTC m=+0.028751188 container kill 9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 15:35:37 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[213470]: [ALERT]    (213474) : Current worker (213476) exited with code 143 (Terminated)
Jan 26 15:35:37 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[213470]: [WARNING]  (213474) : All workers exited. Exiting... (0)
Jan 26 15:35:37 compute-1 systemd[1]: libpod-9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39.scope: Deactivated successfully.
Jan 26 15:35:37 compute-1 podman[213784]: 2026-01-26 15:35:37.936293753 +0000 UTC m=+0.023109775 container died 9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:35:37 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39-userdata-shm.mount: Deactivated successfully.
Jan 26 15:35:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-4db4710a879ec7acf1d6973db45a34b9b7305f0972f8ee1099ab3b086690a9aa-merged.mount: Deactivated successfully.
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.967 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:37 compute-1 nova_compute[183403]: 2026-01-26 15:35:37.970 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:37 compute-1 podman[213784]: 2026-01-26 15:35:37.975752867 +0000 UTC m=+0.062568869 container cleanup 9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0)
Jan 26 15:35:37 compute-1 systemd[1]: libpod-conmon-9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39.scope: Deactivated successfully.
Jan 26 15:35:37 compute-1 podman[213791]: 2026-01-26 15:35:37.992705304 +0000 UTC m=+0.065765351 container remove 9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 15:35:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:37.998 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf02a72-4963-4548-b9a7-cf596fcb232a]: (4, ("Mon Jan 26 03:35:37 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 (9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39)\n9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39\nMon Jan 26 03:35:37 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 (9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39)\n9289b0f3269e50bc6af619a37d18a76412c9d9bf61368da60ac66e8610a8fa39\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:37 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:37.999 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[495e19ed-1185-4152-831e-734573e1fb73]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:38.000 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:35:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:38.000 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[be3c8783-5598-4404-87ee-4288877075e3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:38.001 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc98e8b1-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.003 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:38 compute-1 kernel: tapcc98e8b1-80: left promiscuous mode
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.007 183407 INFO nova.virt.libvirt.driver [-] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Instance destroyed successfully.
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.007 183407 DEBUG nova.objects.instance [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lazy-loading 'resources' on Instance uuid ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.018 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:38.021 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d0182bd6-2e43-421b-b9a6-fa5f146dc47e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:38.040 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e4691287-37ef-4bd6-ba2a-0575cf4b2018]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:38.041 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[24e8a049-876b-4201-bba9-8756c4afa465]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:38.057 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8db85a3c-e46f-4dfb-9720-1217717f5d23]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540719, 'reachable_time': 36598, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213831, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:38 compute-1 systemd[1]: run-netns-ovnmeta\x2dcc98e8b1\x2d8169\x2d4a08\x2d8b22\x2dcd8a87c017a0.mount: Deactivated successfully.
Jan 26 15:35:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:38.060 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:35:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:38.060 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a65edd-d6b7-4239-bd90-c4624dcfd279]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.394 183407 DEBUG nova.compute.manager [req-6a5b7f89-1008-4a12-abac-a5d1e85c1440 req-a87b30fa-4278-4aaa-994f-672925a1adb2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Received event network-vif-unplugged-abfe8595-6f38-41fe-a0cb-eeaa34a05633 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.395 183407 DEBUG oslo_concurrency.lockutils [req-6a5b7f89-1008-4a12-abac-a5d1e85c1440 req-a87b30fa-4278-4aaa-994f-672925a1adb2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.395 183407 DEBUG oslo_concurrency.lockutils [req-6a5b7f89-1008-4a12-abac-a5d1e85c1440 req-a87b30fa-4278-4aaa-994f-672925a1adb2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.395 183407 DEBUG oslo_concurrency.lockutils [req-6a5b7f89-1008-4a12-abac-a5d1e85c1440 req-a87b30fa-4278-4aaa-994f-672925a1adb2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.395 183407 DEBUG nova.compute.manager [req-6a5b7f89-1008-4a12-abac-a5d1e85c1440 req-a87b30fa-4278-4aaa-994f-672925a1adb2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] No waiting events found dispatching network-vif-unplugged-abfe8595-6f38-41fe-a0cb-eeaa34a05633 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.396 183407 DEBUG nova.compute.manager [req-6a5b7f89-1008-4a12-abac-a5d1e85c1440 req-a87b30fa-4278-4aaa-994f-672925a1adb2 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Received event network-vif-unplugged-abfe8595-6f38-41fe-a0cb-eeaa34a05633 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.514 183407 DEBUG nova.virt.libvirt.vif [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:33:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-624008049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-6240080',id=26,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:33:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f7ddd8ab2ae4841a1f43ae8078bb924',ramdisk_id='',reservation_id='r-s0fi6kdn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:35:24Z,user_data=None,user_id='8152e350b54f44cabafc751c752d6f92',uuid=ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "address": "fa:16:3e:47:95:ef", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabfe8595-6f", "ovs_interfaceid": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.515 183407 DEBUG nova.network.os_vif_util [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Converting VIF {"id": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "address": "fa:16:3e:47:95:ef", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabfe8595-6f", "ovs_interfaceid": "abfe8595-6f38-41fe-a0cb-eeaa34a05633", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.516 183407 DEBUG nova.network.os_vif_util [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:47:95:ef,bridge_name='br-int',has_traffic_filtering=True,id=abfe8595-6f38-41fe-a0cb-eeaa34a05633,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabfe8595-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.516 183407 DEBUG os_vif [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:95:ef,bridge_name='br-int',has_traffic_filtering=True,id=abfe8595-6f38-41fe-a0cb-eeaa34a05633,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabfe8595-6f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.517 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.518 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabfe8595-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.519 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.521 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.521 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.522 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.523 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a13d8cfe-76fd-4254-a716-57bb36bfc5ab) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.523 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.524 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.526 183407 INFO os_vif [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:95:ef,bridge_name='br-int',has_traffic_filtering=True,id=abfe8595-6f38-41fe-a0cb-eeaa34a05633,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabfe8595-6f')
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.527 183407 INFO nova.virt.libvirt.driver [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Deleting instance files /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee_del
Jan 26 15:35:38 compute-1 nova_compute[183403]: 2026-01-26 15:35:38.527 183407 INFO nova.virt.libvirt.driver [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Deletion of /var/lib/nova/instances/ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee_del complete
Jan 26 15:35:39 compute-1 nova_compute[183403]: 2026-01-26 15:35:39.048 183407 INFO nova.compute.manager [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Took 1.30 seconds to destroy the instance on the hypervisor.
Jan 26 15:35:39 compute-1 nova_compute[183403]: 2026-01-26 15:35:39.049 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:35:39 compute-1 nova_compute[183403]: 2026-01-26 15:35:39.050 183407 DEBUG nova.compute.manager [-] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:35:39 compute-1 nova_compute[183403]: 2026-01-26 15:35:39.050 183407 DEBUG nova.network.neutron [-] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:35:39 compute-1 nova_compute[183403]: 2026-01-26 15:35:39.051 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:39 compute-1 nova_compute[183403]: 2026-01-26 15:35:39.305 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:35:40 compute-1 nova_compute[183403]: 2026-01-26 15:35:40.262 183407 DEBUG nova.network.neutron [-] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:35:40 compute-1 nova_compute[183403]: 2026-01-26 15:35:40.457 183407 DEBUG nova.compute.manager [req-fa07e922-90a5-45a8-8979-45259bbb5e96 req-a20c98d8-e1c6-4eec-b9ff-f275cab9dbee 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Received event network-vif-unplugged-abfe8595-6f38-41fe-a0cb-eeaa34a05633 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:35:40 compute-1 nova_compute[183403]: 2026-01-26 15:35:40.457 183407 DEBUG oslo_concurrency.lockutils [req-fa07e922-90a5-45a8-8979-45259bbb5e96 req-a20c98d8-e1c6-4eec-b9ff-f275cab9dbee 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:40 compute-1 nova_compute[183403]: 2026-01-26 15:35:40.457 183407 DEBUG oslo_concurrency.lockutils [req-fa07e922-90a5-45a8-8979-45259bbb5e96 req-a20c98d8-e1c6-4eec-b9ff-f275cab9dbee 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:40 compute-1 nova_compute[183403]: 2026-01-26 15:35:40.458 183407 DEBUG oslo_concurrency.lockutils [req-fa07e922-90a5-45a8-8979-45259bbb5e96 req-a20c98d8-e1c6-4eec-b9ff-f275cab9dbee 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:40 compute-1 nova_compute[183403]: 2026-01-26 15:35:40.458 183407 DEBUG nova.compute.manager [req-fa07e922-90a5-45a8-8979-45259bbb5e96 req-a20c98d8-e1c6-4eec-b9ff-f275cab9dbee 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] No waiting events found dispatching network-vif-unplugged-abfe8595-6f38-41fe-a0cb-eeaa34a05633 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:35:40 compute-1 nova_compute[183403]: 2026-01-26 15:35:40.458 183407 DEBUG nova.compute.manager [req-fa07e922-90a5-45a8-8979-45259bbb5e96 req-a20c98d8-e1c6-4eec-b9ff-f275cab9dbee 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Received event network-vif-unplugged-abfe8595-6f38-41fe-a0cb-eeaa34a05633 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:35:40 compute-1 nova_compute[183403]: 2026-01-26 15:35:40.458 183407 DEBUG nova.compute.manager [req-fa07e922-90a5-45a8-8979-45259bbb5e96 req-a20c98d8-e1c6-4eec-b9ff-f275cab9dbee 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Received event network-vif-deleted-abfe8595-6f38-41fe-a0cb-eeaa34a05633 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:35:40 compute-1 nova_compute[183403]: 2026-01-26 15:35:40.768 183407 INFO nova.compute.manager [-] [instance: ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee] Took 1.72 seconds to deallocate network for instance.
Jan 26 15:35:41 compute-1 nova_compute[183403]: 2026-01-26 15:35:41.292 183407 DEBUG oslo_concurrency.lockutils [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:41 compute-1 nova_compute[183403]: 2026-01-26 15:35:41.293 183407 DEBUG oslo_concurrency.lockutils [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:41 compute-1 nova_compute[183403]: 2026-01-26 15:35:41.356 183407 DEBUG nova.compute.provider_tree [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:35:41 compute-1 nova_compute[183403]: 2026-01-26 15:35:41.864 183407 DEBUG nova.scheduler.client.report [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:35:42 compute-1 nova_compute[183403]: 2026-01-26 15:35:42.053 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:42 compute-1 nova_compute[183403]: 2026-01-26 15:35:42.376 183407 DEBUG oslo_concurrency.lockutils [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.084s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:42 compute-1 nova_compute[183403]: 2026-01-26 15:35:42.411 183407 INFO nova.scheduler.client.report [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Deleted allocations for instance ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee
Jan 26 15:35:43 compute-1 nova_compute[183403]: 2026-01-26 15:35:43.445 183407 DEBUG oslo_concurrency.lockutils [None req-c93099a5-38cb-46e5-b742-93bbe9328401 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "ad7ae9cf-9712-4b5c-9de9-0bb8d8f8c8ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.231s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:43 compute-1 nova_compute[183403]: 2026-01-26 15:35:43.525 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:43 compute-1 podman[213833]: 2026-01-26 15:35:43.908091562 +0000 UTC m=+0.070797754 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 15:35:43 compute-1 podman[213832]: 2026-01-26 15:35:43.936127978 +0000 UTC m=+0.105220604 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 15:35:47 compute-1 nova_compute[183403]: 2026-01-26 15:35:47.071 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:48 compute-1 nova_compute[183403]: 2026-01-26 15:35:48.527 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:49 compute-1 openstack_network_exporter[195610]: ERROR   15:35:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:35:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:35:49 compute-1 openstack_network_exporter[195610]: ERROR   15:35:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:35:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:35:51 compute-1 nova_compute[183403]: 2026-01-26 15:35:51.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:35:51 compute-1 nova_compute[183403]: 2026-01-26 15:35:51.577 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:51 compute-1 nova_compute[183403]: 2026-01-26 15:35:51.578 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:51 compute-1 nova_compute[183403]: 2026-01-26 15:35:51.578 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:51 compute-1 nova_compute[183403]: 2026-01-26 15:35:51.579 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:35:51 compute-1 nova_compute[183403]: 2026-01-26 15:35:51.579 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:35:51 compute-1 nova_compute[183403]: 2026-01-26 15:35:51.580 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:35:52 compute-1 nova_compute[183403]: 2026-01-26 15:35:52.056 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:52 compute-1 nova_compute[183403]: 2026-01-26 15:35:52.595 183407 DEBUG nova.virt.libvirt.imagecache [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 26 15:35:52 compute-1 nova_compute[183403]: 2026-01-26 15:35:52.595 183407 WARNING nova.virt.libvirt.imagecache [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0
Jan 26 15:35:52 compute-1 nova_compute[183403]: 2026-01-26 15:35:52.596 183407 INFO nova.virt.libvirt.imagecache [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Removable base files: /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0
Jan 26 15:35:52 compute-1 nova_compute[183403]: 2026-01-26 15:35:52.596 183407 INFO nova.virt.libvirt.imagecache [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0
Jan 26 15:35:52 compute-1 nova_compute[183403]: 2026-01-26 15:35:52.596 183407 DEBUG nova.virt.libvirt.imagecache [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 26 15:35:52 compute-1 nova_compute[183403]: 2026-01-26 15:35:52.596 183407 DEBUG nova.virt.libvirt.imagecache [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 26 15:35:52 compute-1 nova_compute[183403]: 2026-01-26 15:35:52.597 183407 DEBUG nova.virt.libvirt.imagecache [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 26 15:35:53 compute-1 nova_compute[183403]: 2026-01-26 15:35:53.529 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:53.975 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:35:53 compute-1 nova_compute[183403]: 2026-01-26 15:35:53.976 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:53 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:35:53.976 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:35:57 compute-1 nova_compute[183403]: 2026-01-26 15:35:57.059 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:35:58 compute-1 nova_compute[183403]: 2026-01-26 15:35:58.530 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:00 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:36:00.977 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:36:02 compute-1 nova_compute[183403]: 2026-01-26 15:36:02.061 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:03 compute-1 nova_compute[183403]: 2026-01-26 15:36:03.533 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:03 compute-1 podman[213880]: 2026-01-26 15:36:03.906305298 +0000 UTC m=+0.084013425 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:36:03 compute-1 podman[213879]: 2026-01-26 15:36:03.92410848 +0000 UTC m=+0.100580672 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:36:05 compute-1 podman[192725]: time="2026-01-26T15:36:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:36:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:36:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:36:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:36:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Jan 26 15:36:07 compute-1 nova_compute[183403]: 2026-01-26 15:36:07.063 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:08 compute-1 nova_compute[183403]: 2026-01-26 15:36:08.535 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:12 compute-1 nova_compute[183403]: 2026-01-26 15:36:12.067 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:13 compute-1 nova_compute[183403]: 2026-01-26 15:36:13.546 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:14 compute-1 podman[213925]: 2026-01-26 15:36:14.87397654 +0000 UTC m=+0.048086532 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 15:36:14 compute-1 podman[213924]: 2026-01-26 15:36:14.928032954 +0000 UTC m=+0.108083156 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:36:17 compute-1 nova_compute[183403]: 2026-01-26 15:36:17.070 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:18 compute-1 nova_compute[183403]: 2026-01-26 15:36:18.549 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:19 compute-1 openstack_network_exporter[195610]: ERROR   15:36:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:36:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:36:19 compute-1 openstack_network_exporter[195610]: ERROR   15:36:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:36:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:36:22 compute-1 nova_compute[183403]: 2026-01-26 15:36:22.071 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:22 compute-1 nova_compute[183403]: 2026-01-26 15:36:22.596 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:23 compute-1 nova_compute[183403]: 2026-01-26 15:36:23.550 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:27 compute-1 nova_compute[183403]: 2026-01-26 15:36:27.073 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:27 compute-1 nova_compute[183403]: 2026-01-26 15:36:27.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:28 compute-1 nova_compute[183403]: 2026-01-26 15:36:28.551 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:28 compute-1 nova_compute[183403]: 2026-01-26 15:36:28.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.090 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.091 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.091 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:36:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:36:29.092 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:36:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:36:29.093 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:36:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:36:29.093 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.251 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.252 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.272 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.273 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5853MB free_disk=73.14492416381836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.273 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:36:29 compute-1 nova_compute[183403]: 2026-01-26 15:36:29.273 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:36:30 compute-1 nova_compute[183403]: 2026-01-26 15:36:30.317 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:36:30 compute-1 nova_compute[183403]: 2026-01-26 15:36:30.318 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:36:29 up  1:31,  0 user,  load average: 0.08, 0.14, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:36:30 compute-1 nova_compute[183403]: 2026-01-26 15:36:30.345 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:36:30 compute-1 nova_compute[183403]: 2026-01-26 15:36:30.854 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:36:31 compute-1 nova_compute[183403]: 2026-01-26 15:36:31.365 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:36:31 compute-1 nova_compute[183403]: 2026-01-26 15:36:31.366 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:36:31 compute-1 nova_compute[183403]: 2026-01-26 15:36:31.366 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:31 compute-1 nova_compute[183403]: 2026-01-26 15:36:31.367 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:36:31 compute-1 nova_compute[183403]: 2026-01-26 15:36:31.875 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:36:32 compute-1 nova_compute[183403]: 2026-01-26 15:36:32.077 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:32 compute-1 nova_compute[183403]: 2026-01-26 15:36:32.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:32 compute-1 nova_compute[183403]: 2026-01-26 15:36:32.578 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:32 compute-1 nova_compute[183403]: 2026-01-26 15:36:32.578 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:32 compute-1 nova_compute[183403]: 2026-01-26 15:36:32.578 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:32 compute-1 nova_compute[183403]: 2026-01-26 15:36:32.579 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:36:32 compute-1 nova_compute[183403]: 2026-01-26 15:36:32.579 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:32 compute-1 nova_compute[183403]: 2026-01-26 15:36:32.579 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:36:33 compute-1 nova_compute[183403]: 2026-01-26 15:36:33.553 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:34 compute-1 nova_compute[183403]: 2026-01-26 15:36:34.081 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:34 compute-1 podman[213972]: 2026-01-26 15:36:34.881551033 +0000 UTC m=+0.056984329 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:36:34 compute-1 podman[213973]: 2026-01-26 15:36:34.917348713 +0000 UTC m=+0.087625610 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:36:35 compute-1 podman[192725]: time="2026-01-26T15:36:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:36:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:36:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:36:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:36:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Jan 26 15:36:37 compute-1 nova_compute[183403]: 2026-01-26 15:36:37.080 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:38 compute-1 nova_compute[183403]: 2026-01-26 15:36:38.556 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:41 compute-1 nova_compute[183403]: 2026-01-26 15:36:41.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:42 compute-1 nova_compute[183403]: 2026-01-26 15:36:42.080 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:43 compute-1 nova_compute[183403]: 2026-01-26 15:36:43.558 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:45 compute-1 podman[214019]: 2026-01-26 15:36:45.889497023 +0000 UTC m=+0.062025704 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:36:45 compute-1 podman[214018]: 2026-01-26 15:36:45.921502473 +0000 UTC m=+0.098896154 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 15:36:47 compute-1 nova_compute[183403]: 2026-01-26 15:36:47.084 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:48 compute-1 nova_compute[183403]: 2026-01-26 15:36:48.560 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:49 compute-1 openstack_network_exporter[195610]: ERROR   15:36:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:36:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:36:49 compute-1 openstack_network_exporter[195610]: ERROR   15:36:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:36:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:36:50 compute-1 nova_compute[183403]: 2026-01-26 15:36:50.445 183407 DEBUG nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Creating tmpfile /var/lib/nova/instances/tmp3a33oorc to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:36:50 compute-1 nova_compute[183403]: 2026-01-26 15:36:50.447 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:36:50 compute-1 nova_compute[183403]: 2026-01-26 15:36:50.448 183407 DEBUG nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Creating tmpfile /var/lib/nova/instances/tmpmga2mrby to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:36:50 compute-1 nova_compute[183403]: 2026-01-26 15:36:50.448 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:36:50 compute-1 nova_compute[183403]: 2026-01-26 15:36:50.452 183407 DEBUG nova.compute.manager [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3a33oorc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:36:50 compute-1 nova_compute[183403]: 2026-01-26 15:36:50.457 183407 DEBUG nova.compute.manager [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmga2mrby',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:36:51 compute-1 ovn_controller[95641]: 2026-01-26T15:36:51Z|00214|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 15:36:51 compute-1 nova_compute[183403]: 2026-01-26 15:36:51.997 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:36:51 compute-1 nova_compute[183403]: 2026-01-26 15:36:51.999 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:36:52 compute-1 nova_compute[183403]: 2026-01-26 15:36:52.085 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:53 compute-1 nova_compute[183403]: 2026-01-26 15:36:53.563 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:56 compute-1 nova_compute[183403]: 2026-01-26 15:36:56.011 183407 DEBUG nova.compute.manager [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmga2mrby',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5f9467db-9e52-4153-bd8b-23e40544aee2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:36:57 compute-1 nova_compute[183403]: 2026-01-26 15:36:57.027 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-5f9467db-9e52-4153-bd8b-23e40544aee2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:36:57 compute-1 nova_compute[183403]: 2026-01-26 15:36:57.028 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-5f9467db-9e52-4153-bd8b-23e40544aee2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:36:57 compute-1 nova_compute[183403]: 2026-01-26 15:36:57.028 183407 DEBUG nova.network.neutron [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:36:57 compute-1 nova_compute[183403]: 2026-01-26 15:36:57.089 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:57 compute-1 nova_compute[183403]: 2026-01-26 15:36:57.534 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:36:57 compute-1 nova_compute[183403]: 2026-01-26 15:36:57.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:36:58 compute-1 nova_compute[183403]: 2026-01-26 15:36:58.566 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:36:59 compute-1 nova_compute[183403]: 2026-01-26 15:36:59.322 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:36:59 compute-1 nova_compute[183403]: 2026-01-26 15:36:59.505 183407 DEBUG nova.network.neutron [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Updating instance_info_cache with network_info: [{"id": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "address": "fa:16:3e:83:18:02", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6d9d4f-93", "ovs_interfaceid": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.013 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-5f9467db-9e52-4153-bd8b-23e40544aee2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.028 183407 DEBUG nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmga2mrby',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5f9467db-9e52-4153-bd8b-23e40544aee2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.029 183407 DEBUG nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Creating instance directory: /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.029 183407 DEBUG nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Creating disk.info with the contents: {'/var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk': 'qcow2', '/var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.030 183407 DEBUG nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.030 183407 DEBUG nova.objects.instance [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5f9467db-9e52-4153-bd8b-23e40544aee2 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.536 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.540 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.542 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.629 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.630 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.631 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.632 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.638 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.638 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.700 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:00 compute-1 nova_compute[183403]: 2026-01-26 15:37:00.701 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:01 compute-1 nova_compute[183403]: 2026-01-26 15:37:01.517 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk 1073741824" returned: 0 in 0.816s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:01 compute-1 nova_compute[183403]: 2026-01-26 15:37:01.519 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.887s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:37:01 compute-1 nova_compute[183403]: 2026-01-26 15:37:01.519 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:01 compute-1 nova_compute[183403]: 2026-01-26 15:37:01.606 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:01 compute-1 nova_compute[183403]: 2026-01-26 15:37:01.607 183407 DEBUG nova.virt.disk.api [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:37:01 compute-1 nova_compute[183403]: 2026-01-26 15:37:01.608 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:01 compute-1 nova_compute[183403]: 2026-01-26 15:37:01.668 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:01 compute-1 nova_compute[183403]: 2026-01-26 15:37:01.669 183407 DEBUG nova.virt.disk.api [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:37:01 compute-1 nova_compute[183403]: 2026-01-26 15:37:01.669 183407 DEBUG nova.objects.instance [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 5f9467db-9e52-4153-bd8b-23e40544aee2 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:37:01 compute-1 anacron[203574]: Job `cron.daily' started
Jan 26 15:37:01 compute-1 anacron[203574]: Job `cron.daily' terminated
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.136 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.182 183407 DEBUG nova.objects.base [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<5f9467db-9e52-4153-bd8b-23e40544aee2> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.183 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.218 183407 DEBUG oslo_concurrency.processutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk.config 497664" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.219 183407 DEBUG nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.220 183407 DEBUG nova.virt.libvirt.vif [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:36:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-117472076',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1174720',id=29,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:36:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f7ddd8ab2ae4841a1f43ae8078bb924',ramdisk_id='',reservation_id='r-lt5qszy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:36:22Z,user_data=None,user_id='8152e350b54f44cabafc751c752d6f92',uuid=5f9467db-9e52-4153-bd8b-23e40544aee2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "address": "fa:16:3e:83:18:02", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfc6d9d4f-93", "ovs_interfaceid": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.220 183407 DEBUG nova.network.os_vif_util [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "address": "fa:16:3e:83:18:02", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfc6d9d4f-93", "ovs_interfaceid": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.221 183407 DEBUG nova.network.os_vif_util [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:18:02,bridge_name='br-int',has_traffic_filtering=True,id=fc6d9d4f-93a9-4b14-81b9-6d4d927635b5,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6d9d4f-93') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.221 183407 DEBUG os_vif [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:18:02,bridge_name='br-int',has_traffic_filtering=True,id=fc6d9d4f-93a9-4b14-81b9-6d4d927635b5,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6d9d4f-93') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.222 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.222 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.223 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.223 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.223 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1a2d36dc-ec61-5a42-854c-2a1d6b2cebf6', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.225 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.227 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.231 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.231 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc6d9d4f-93, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.232 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapfc6d9d4f-93, col_values=(('qos', UUID('b597b0ec-31e1-4960-ae31-26ca4c6f5403')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.232 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapfc6d9d4f-93, col_values=(('external_ids', {'iface-id': 'fc6d9d4f-93a9-4b14-81b9-6d4d927635b5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:18:02', 'vm-uuid': '5f9467db-9e52-4153-bd8b-23e40544aee2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.234 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:02 compute-1 NetworkManager[55716]: <info>  [1769441822.2359] manager: (tapfc6d9d4f-93): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.236 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.243 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.244 183407 INFO os_vif [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:18:02,bridge_name='br-int',has_traffic_filtering=True,id=fc6d9d4f-93a9-4b14-81b9-6d4d927635b5,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6d9d4f-93')
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.244 183407 DEBUG nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.244 183407 DEBUG nova.compute.manager [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmga2mrby',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5f9467db-9e52-4153-bd8b-23e40544aee2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.245 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.364 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:02.674 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.674 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:02 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:02.675 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.977 183407 DEBUG nova.network.neutron [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Port fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:37:02 compute-1 nova_compute[183403]: 2026-01-26 15:37:02.988 183407 DEBUG nova.compute.manager [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmga2mrby',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5f9467db-9e52-4153-bd8b-23e40544aee2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:37:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:03.677 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:05 compute-1 podman[192725]: time="2026-01-26T15:37:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:37:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:37:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:37:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:37:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:37:05 compute-1 podman[214083]: 2026-01-26 15:37:05.877237937 +0000 UTC m=+0.054574450 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:37:05 compute-1 podman[214084]: 2026-01-26 15:37:05.88462586 +0000 UTC m=+0.059613935 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, name=ubi9-minimal, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Jan 26 15:37:06 compute-1 kernel: tapfc6d9d4f-93: entered promiscuous mode
Jan 26 15:37:06 compute-1 ovn_controller[95641]: 2026-01-26T15:37:06Z|00215|binding|INFO|Claiming lport fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 for this additional chassis.
Jan 26 15:37:06 compute-1 ovn_controller[95641]: 2026-01-26T15:37:06Z|00216|binding|INFO|fc6d9d4f-93a9-4b14-81b9-6d4d927635b5: Claiming fa:16:3e:83:18:02 10.100.0.13
Jan 26 15:37:06 compute-1 nova_compute[183403]: 2026-01-26 15:37:06.568 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:06 compute-1 NetworkManager[55716]: <info>  [1769441826.5716] manager: (tapfc6d9d4f-93): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Jan 26 15:37:06 compute-1 nova_compute[183403]: 2026-01-26 15:37:06.571 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.580 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:18:02 10.100.0.13'], port_security=['fa:16:3e:83:18:02 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5f9467db-9e52-4153-bd8b-23e40544aee2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c563489a-e307-4381-b22d-8f22c6dbbfd6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76661f47-b7c7-4131-9a1a-0f8828404115, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=fc6d9d4f-93a9-4b14-81b9-6d4d927635b5) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:37:06 compute-1 ovn_controller[95641]: 2026-01-26T15:37:06Z|00217|binding|INFO|Setting lport fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 ovn-installed in OVS
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.582 104930 INFO neutron.agent.ovn.metadata.agent [-] Port fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 in datapath cc98e8b1-8169-4a08-8b22-cd8a87c017a0 unbound from our chassis
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.583 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:37:06 compute-1 nova_compute[183403]: 2026-01-26 15:37:06.583 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:06 compute-1 nova_compute[183403]: 2026-01-26 15:37:06.585 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.596 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a240ed-0fb4-4d57-a6ee-b3709d7395f7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.598 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc98e8b1-81 in ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.599 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc98e8b1-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.600 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0242acf0-46d0-480a-b1d1-f4ba2587de1f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.600 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[da444875-dccc-459d-91ad-cfe9ca74fbff]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 systemd-udevd[214143]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:37:06 compute-1 systemd-machined[154697]: New machine qemu-20-instance-0000001d.
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.612 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[ff3db050-cc15-4652-be11-e64cff25f76e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 NetworkManager[55716]: <info>  [1769441826.6192] device (tapfc6d9d4f-93): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.618 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[40e54c58-ed42-46dd-ab2e-2891bebda04c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 NetworkManager[55716]: <info>  [1769441826.6200] device (tapfc6d9d4f-93): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:37:06 compute-1 systemd[1]: Started Virtual Machine qemu-20-instance-0000001d.
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.650 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[93fd152f-977d-4920-a6c8-71ccd0390b56]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.653 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[325c2372-71c3-4449-80d3-ac50901578b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 NetworkManager[55716]: <info>  [1769441826.6550] manager: (tapcc98e8b1-80): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.693 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7bd4ac-e794-4f92-9087-f479e634da54]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.695 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[400e6a53-2c1c-486e-9e4b-14830cefafa2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 NetworkManager[55716]: <info>  [1769441826.7230] device (tapcc98e8b1-80): carrier: link connected
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.730 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[2c19f63e-44dc-4632-b373-06b15ec4e021]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.748 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e42cd691-0c33-4cc4-8b5f-003ace1cf7fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc98e8b1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:f7:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554932, 'reachable_time': 37014, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214175, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.764 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a2a8876f-9d75-41aa-8798-b9c473b7324c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:f73a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554932, 'tstamp': 554932}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214176, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.784 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cbc804-2701-40fd-9af9-7e46f2d2d594]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc98e8b1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:f7:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554932, 'reachable_time': 37014, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214177, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.817 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[496a6965-c46a-42ae-acdb-92e13840b8cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.884 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bc6a5727-c61a-4ab6-bff3-afdf1dfef09f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.886 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc98e8b1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.886 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.886 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc98e8b1-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:06 compute-1 NetworkManager[55716]: <info>  [1769441826.8893] manager: (tapcc98e8b1-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 26 15:37:06 compute-1 nova_compute[183403]: 2026-01-26 15:37:06.889 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:06 compute-1 kernel: tapcc98e8b1-80: entered promiscuous mode
Jan 26 15:37:06 compute-1 nova_compute[183403]: 2026-01-26 15:37:06.891 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.891 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc98e8b1-80, col_values=(('external_ids', {'iface-id': '219edf33-3765-4d7c-87b5-4ab0ed1d6a8a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:06 compute-1 nova_compute[183403]: 2026-01-26 15:37:06.892 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:06 compute-1 ovn_controller[95641]: 2026-01-26T15:37:06Z|00218|binding|INFO|Releasing lport 219edf33-3765-4d7c-87b5-4ab0ed1d6a8a from this chassis (sb_readonly=0)
Jan 26 15:37:06 compute-1 nova_compute[183403]: 2026-01-26 15:37:06.905 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.906 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[96cd1570-7872-49ec-bf38-26c49758fc9d]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.907 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.907 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.907 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for cc98e8b1-8169-4a08-8b22-cd8a87c017a0 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.907 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.908 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c8c0b63c-1ad2-4eef-840c-9b7fcbd42a04]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.908 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.909 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c7192d6b-9f83-4c0e-8e8f-7b6ab979c3a4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.909 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:37:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:06.909 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'env', 'PROCESS_TAG=haproxy-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:37:06 compute-1 nova_compute[183403]: 2026-01-26 15:37:06.926 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:07 compute-1 nova_compute[183403]: 2026-01-26 15:37:07.139 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:07 compute-1 nova_compute[183403]: 2026-01-26 15:37:07.234 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:07 compute-1 podman[214216]: 2026-01-26 15:37:07.308854615 +0000 UTC m=+0.054826436 container create 7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Jan 26 15:37:07 compute-1 systemd[1]: Started libpod-conmon-7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed.scope.
Jan 26 15:37:07 compute-1 podman[214216]: 2026-01-26 15:37:07.277496403 +0000 UTC m=+0.023468255 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:37:07 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:37:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/806063821df49012ebc505ea3ce04c9947c8c17b3be388d1754d8012dd4d2102/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:37:07 compute-1 podman[214216]: 2026-01-26 15:37:07.509702248 +0000 UTC m=+0.255674089 container init 7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 15:37:07 compute-1 podman[214216]: 2026-01-26 15:37:07.517956525 +0000 UTC m=+0.263928346 container start 7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:37:07 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[214237]: [NOTICE]   (214249) : New worker (214251) forked
Jan 26 15:37:07 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[214237]: [NOTICE]   (214249) : Loading success.
Jan 26 15:37:09 compute-1 ovn_controller[95641]: 2026-01-26T15:37:09Z|00219|binding|INFO|Claiming lport fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 for this chassis.
Jan 26 15:37:09 compute-1 ovn_controller[95641]: 2026-01-26T15:37:09Z|00220|binding|INFO|fc6d9d4f-93a9-4b14-81b9-6d4d927635b5: Claiming fa:16:3e:83:18:02 10.100.0.13
Jan 26 15:37:09 compute-1 ovn_controller[95641]: 2026-01-26T15:37:09Z|00221|binding|INFO|Setting lport fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 up in Southbound
Jan 26 15:37:10 compute-1 nova_compute[183403]: 2026-01-26 15:37:10.710 183407 INFO nova.compute.manager [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Post operation of migration started
Jan 26 15:37:10 compute-1 nova_compute[183403]: 2026-01-26 15:37:10.711 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:11 compute-1 nova_compute[183403]: 2026-01-26 15:37:11.365 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:11 compute-1 nova_compute[183403]: 2026-01-26 15:37:11.366 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:11 compute-1 nova_compute[183403]: 2026-01-26 15:37:11.458 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-5f9467db-9e52-4153-bd8b-23e40544aee2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:37:11 compute-1 nova_compute[183403]: 2026-01-26 15:37:11.459 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-5f9467db-9e52-4153-bd8b-23e40544aee2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:37:11 compute-1 nova_compute[183403]: 2026-01-26 15:37:11.460 183407 DEBUG nova.network.neutron [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:37:11 compute-1 nova_compute[183403]: 2026-01-26 15:37:11.972 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:12 compute-1 nova_compute[183403]: 2026-01-26 15:37:12.141 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:12 compute-1 nova_compute[183403]: 2026-01-26 15:37:12.236 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:12 compute-1 nova_compute[183403]: 2026-01-26 15:37:12.690 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:12 compute-1 nova_compute[183403]: 2026-01-26 15:37:12.812 183407 DEBUG nova.network.neutron [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Updating instance_info_cache with network_info: [{"id": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "address": "fa:16:3e:83:18:02", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6d9d4f-93", "ovs_interfaceid": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:37:13 compute-1 nova_compute[183403]: 2026-01-26 15:37:13.319 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-5f9467db-9e52-4153-bd8b-23e40544aee2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:37:13 compute-1 nova_compute[183403]: 2026-01-26 15:37:13.847 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:37:13 compute-1 nova_compute[183403]: 2026-01-26 15:37:13.848 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:37:13 compute-1 nova_compute[183403]: 2026-01-26 15:37:13.848 183407 DEBUG oslo_concurrency.lockutils [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:37:13 compute-1 nova_compute[183403]: 2026-01-26 15:37:13.853 183407 INFO nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:37:13 compute-1 virtqemud[183290]: Domain id=20 name='instance-0000001d' uuid=5f9467db-9e52-4153-bd8b-23e40544aee2 is tainted: custom-monitor
Jan 26 15:37:14 compute-1 nova_compute[183403]: 2026-01-26 15:37:14.861 183407 INFO nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:37:15 compute-1 nova_compute[183403]: 2026-01-26 15:37:15.869 183407 INFO nova.virt.libvirt.driver [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:37:15 compute-1 nova_compute[183403]: 2026-01-26 15:37:15.875 183407 DEBUG nova.compute.manager [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:37:16 compute-1 nova_compute[183403]: 2026-01-26 15:37:16.387 183407 DEBUG nova.objects.instance [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:37:16 compute-1 podman[214261]: 2026-01-26 15:37:16.88705456 +0000 UTC m=+0.058299177 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 15:37:16 compute-1 podman[214260]: 2026-01-26 15:37:16.932160347 +0000 UTC m=+0.104751282 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260120, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:37:17 compute-1 nova_compute[183403]: 2026-01-26 15:37:17.152 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:17 compute-1 nova_compute[183403]: 2026-01-26 15:37:17.238 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:17 compute-1 nova_compute[183403]: 2026-01-26 15:37:17.410 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:18 compute-1 nova_compute[183403]: 2026-01-26 15:37:18.472 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:18 compute-1 nova_compute[183403]: 2026-01-26 15:37:18.473 183407 WARNING neutronclient.v2_0.client [None req-1f2bdc8c-7105-44c5-9458-799ee34487bf a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:19 compute-1 openstack_network_exporter[195610]: ERROR   15:37:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:37:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:37:19 compute-1 openstack_network_exporter[195610]: ERROR   15:37:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:37:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:37:21 compute-1 nova_compute[183403]: 2026-01-26 15:37:21.028 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:37:21 compute-1 nova_compute[183403]: 2026-01-26 15:37:21.538 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Triggering sync for uuid 5f9467db-9e52-4153-bd8b-23e40544aee2 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Jan 26 15:37:21 compute-1 nova_compute[183403]: 2026-01-26 15:37:21.539 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "5f9467db-9e52-4153-bd8b-23e40544aee2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:37:21 compute-1 nova_compute[183403]: 2026-01-26 15:37:21.540 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:37:22 compute-1 nova_compute[183403]: 2026-01-26 15:37:22.050 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.510s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:37:22 compute-1 nova_compute[183403]: 2026-01-26 15:37:22.154 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:22 compute-1 nova_compute[183403]: 2026-01-26 15:37:22.241 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:24 compute-1 nova_compute[183403]: 2026-01-26 15:37:24.089 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:37:27 compute-1 nova_compute[183403]: 2026-01-26 15:37:27.155 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:27 compute-1 nova_compute[183403]: 2026-01-26 15:37:27.243 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:27 compute-1 nova_compute[183403]: 2026-01-26 15:37:27.640 183407 DEBUG nova.compute.manager [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3a33oorc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ddc6e2e7-fe6a-4589-a4e8-00138e842f1d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:37:28 compute-1 nova_compute[183403]: 2026-01-26 15:37:28.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:37:28 compute-1 nova_compute[183403]: 2026-01-26 15:37:28.654 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-ddc6e2e7-fe6a-4589-a4e8-00138e842f1d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:37:28 compute-1 nova_compute[183403]: 2026-01-26 15:37:28.655 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-ddc6e2e7-fe6a-4589-a4e8-00138e842f1d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:37:28 compute-1 nova_compute[183403]: 2026-01-26 15:37:28.655 183407 DEBUG nova.network.neutron [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:37:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:29.094 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:37:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:29.094 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:37:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:29.095 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:37:29 compute-1 nova_compute[183403]: 2026-01-26 15:37:29.165 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:29 compute-1 nova_compute[183403]: 2026-01-26 15:37:29.882 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:37:29 compute-1 nova_compute[183403]: 2026-01-26 15:37:29.883 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:37:29 compute-1 nova_compute[183403]: 2026-01-26 15:37:29.883 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:37:29 compute-1 nova_compute[183403]: 2026-01-26 15:37:29.884 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:37:30 compute-1 nova_compute[183403]: 2026-01-26 15:37:30.942 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:31 compute-1 nova_compute[183403]: 2026-01-26 15:37:31.007 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:31 compute-1 nova_compute[183403]: 2026-01-26 15:37:31.009 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:31 compute-1 nova_compute[183403]: 2026-01-26 15:37:31.069 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:31 compute-1 nova_compute[183403]: 2026-01-26 15:37:31.229 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:37:31 compute-1 nova_compute[183403]: 2026-01-26 15:37:31.230 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:31 compute-1 nova_compute[183403]: 2026-01-26 15:37:31.247 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:31 compute-1 nova_compute[183403]: 2026-01-26 15:37:31.247 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5696MB free_disk=73.11585235595703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:37:31 compute-1 nova_compute[183403]: 2026-01-26 15:37:31.248 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:37:31 compute-1 nova_compute[183403]: 2026-01-26 15:37:31.248 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:37:32 compute-1 nova_compute[183403]: 2026-01-26 15:37:32.157 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:32 compute-1 nova_compute[183403]: 2026-01-26 15:37:32.244 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:32 compute-1 nova_compute[183403]: 2026-01-26 15:37:32.252 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.082 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Migration for instance ddc6e2e7-fe6a-4589-a4e8-00138e842f1d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.263 183407 DEBUG nova.network.neutron [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Updating instance_info_cache with network_info: [{"id": "8152de5c-ac6c-45dd-8313-2c972cb67562", "address": "fa:16:3e:44:9b:fd", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8152de5c-ac", "ovs_interfaceid": "8152de5c-ac6c-45dd-8313-2c972cb67562", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.589 183407 INFO nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Updating resource usage from migration 87f119f3-05a4-4748-8678-3c7fd8bd0737
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.590 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Starting to track incoming migration 87f119f3-05a4-4748-8678-3c7fd8bd0737 with flavor 74480e15-23e6-4569-8ef9-3ddf5ac8b981 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.787 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-ddc6e2e7-fe6a-4589-a4e8-00138e842f1d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.812 183407 DEBUG nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3a33oorc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ddc6e2e7-fe6a-4589-a4e8-00138e842f1d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.813 183407 DEBUG nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Creating instance directory: /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.813 183407 DEBUG nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Creating disk.info with the contents: {'/var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk': 'qcow2', '/var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.813 183407 DEBUG nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:37:33 compute-1 nova_compute[183403]: 2026-01-26 15:37:33.814 183407 DEBUG nova.objects.instance [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid ddc6e2e7-fe6a-4589-a4e8-00138e842f1d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.322 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.325 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.326 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.389 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.391 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.392 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.392 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.395 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.395 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.454 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.456 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.740 183407 WARNING nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance ddc6e2e7-fe6a-4589-a4e8-00138e842f1d has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.741 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 5f9467db-9e52-4153-bd8b-23e40544aee2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.742 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.742 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:37:31 up  1:32,  0 user,  load average: 0.10, 0.13, 0.18\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_8f7ddd8ab2ae4841a1f43ae8078bb924': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.823 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.884 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.884 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.899 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.916 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:37:34 compute-1 nova_compute[183403]: 2026-01-26 15:37:34.970 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.075 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk 1073741824" returned: 0 in 0.619s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.076 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.684s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.077 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.132 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.134 183407 DEBUG nova.virt.disk.api [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.135 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.185 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.186 183407 DEBUG nova.virt.disk.api [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.187 183407 DEBUG nova.objects.instance [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid ddc6e2e7-fe6a-4589-a4e8-00138e842f1d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.480 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:37:35 compute-1 podman[192725]: time="2026-01-26T15:37:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:37:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:37:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:37:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:37:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2660 "" "Go-http-client/1.1"
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.695 183407 DEBUG nova.objects.base [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<ddc6e2e7-fe6a-4589-a4e8-00138e842f1d> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.696 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.726 183407 DEBUG oslo_concurrency.processutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d/disk.config 497664" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.728 183407 DEBUG nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.730 183407 DEBUG nova.virt.libvirt.vif [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:35:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1988940178',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1988940',id=28,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:36:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f7ddd8ab2ae4841a1f43ae8078bb924',ramdisk_id='',reservation_id='r-a8nl1jf1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:36:03Z,user_data=None,user_id='8152e350b54f44cabafc751c752d6f92',uuid=ddc6e2e7-fe6a-4589-a4e8-00138e842f1d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8152de5c-ac6c-45dd-8313-2c972cb67562", "address": "fa:16:3e:44:9b:fd", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8152de5c-ac", "ovs_interfaceid": "8152de5c-ac6c-45dd-8313-2c972cb67562", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.731 183407 DEBUG nova.network.os_vif_util [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "8152de5c-ac6c-45dd-8313-2c972cb67562", "address": "fa:16:3e:44:9b:fd", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8152de5c-ac", "ovs_interfaceid": "8152de5c-ac6c-45dd-8313-2c972cb67562", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.732 183407 DEBUG nova.network.os_vif_util [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=8152de5c-ac6c-45dd-8313-2c972cb67562,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8152de5c-ac') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.733 183407 DEBUG os_vif [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=8152de5c-ac6c-45dd-8313-2c972cb67562,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8152de5c-ac') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.734 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.734 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.735 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.736 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.737 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '367c52bd-bcb0-58ab-a8a2-1b54217ec748', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.738 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.741 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.743 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.744 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8152de5c-ac, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.744 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8152de5c-ac, col_values=(('qos', UUID('a496c152-c7f2-42b4-8fc8-4b8a34c04391')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.744 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8152de5c-ac, col_values=(('external_ids', {'iface-id': '8152de5c-ac6c-45dd-8313-2c972cb67562', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:9b:fd', 'vm-uuid': 'ddc6e2e7-fe6a-4589-a4e8-00138e842f1d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.745 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:35 compute-1 NetworkManager[55716]: <info>  [1769441855.7465] manager: (tap8152de5c-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.747 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.751 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.752 183407 INFO os_vif [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=8152de5c-ac6c-45dd-8313-2c972cb67562,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8152de5c-ac')
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.753 183407 DEBUG nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.753 183407 DEBUG nova.compute.manager [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3a33oorc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ddc6e2e7-fe6a-4589-a4e8-00138e842f1d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.754 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.852 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.991 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:37:35 compute-1 nova_compute[183403]: 2026-01-26 15:37:35.991 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.743s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:37:36 compute-1 podman[214344]: 2026-01-26 15:37:36.931442513 +0000 UTC m=+0.097213866 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 15:37:36 compute-1 podman[214343]: 2026-01-26 15:37:36.948515904 +0000 UTC m=+0.118903139 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:37:36 compute-1 nova_compute[183403]: 2026-01-26 15:37:36.991 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:37:36 compute-1 nova_compute[183403]: 2026-01-26 15:37:36.991 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:37:36 compute-1 nova_compute[183403]: 2026-01-26 15:37:36.992 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:37:36 compute-1 nova_compute[183403]: 2026-01-26 15:37:36.992 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:37:36 compute-1 nova_compute[183403]: 2026-01-26 15:37:36.992 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:37:36 compute-1 nova_compute[183403]: 2026-01-26 15:37:36.993 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:37:36 compute-1 nova_compute[183403]: 2026-01-26 15:37:36.993 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:37:37 compute-1 nova_compute[183403]: 2026-01-26 15:37:37.159 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:39 compute-1 ovn_controller[95641]: 2026-01-26T15:37:39Z|00222|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 15:37:40 compute-1 nova_compute[183403]: 2026-01-26 15:37:40.747 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:42 compute-1 nova_compute[183403]: 2026-01-26 15:37:42.161 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:43 compute-1 nova_compute[183403]: 2026-01-26 15:37:43.227 183407 DEBUG nova.network.neutron [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Port 8152de5c-ac6c-45dd-8313-2c972cb67562 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:37:43 compute-1 nova_compute[183403]: 2026-01-26 15:37:43.240 183407 DEBUG nova.compute.manager [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3a33oorc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ddc6e2e7-fe6a-4589-a4e8-00138e842f1d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:37:45 compute-1 nova_compute[183403]: 2026-01-26 15:37:45.751 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:45 compute-1 NetworkManager[55716]: <info>  [1769441865.8369] manager: (tap8152de5c-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Jan 26 15:37:45 compute-1 kernel: tap8152de5c-ac: entered promiscuous mode
Jan 26 15:37:45 compute-1 ovn_controller[95641]: 2026-01-26T15:37:45Z|00223|binding|INFO|Claiming lport 8152de5c-ac6c-45dd-8313-2c972cb67562 for this additional chassis.
Jan 26 15:37:45 compute-1 ovn_controller[95641]: 2026-01-26T15:37:45Z|00224|binding|INFO|8152de5c-ac6c-45dd-8313-2c972cb67562: Claiming fa:16:3e:44:9b:fd 10.100.0.9
Jan 26 15:37:45 compute-1 nova_compute[183403]: 2026-01-26 15:37:45.840 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.848 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:9b:fd 10.100.0.9'], port_security=['fa:16:3e:44:9b:fd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ddc6e2e7-fe6a-4589-a4e8-00138e842f1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c563489a-e307-4381-b22d-8f22c6dbbfd6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76661f47-b7c7-4131-9a1a-0f8828404115, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=8152de5c-ac6c-45dd-8313-2c972cb67562) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.848 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 8152de5c-ac6c-45dd-8313-2c972cb67562 in datapath cc98e8b1-8169-4a08-8b22-cd8a87c017a0 unbound from our chassis
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.850 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:37:45 compute-1 ovn_controller[95641]: 2026-01-26T15:37:45Z|00225|binding|INFO|Setting lport 8152de5c-ac6c-45dd-8313-2c972cb67562 ovn-installed in OVS
Jan 26 15:37:45 compute-1 nova_compute[183403]: 2026-01-26 15:37:45.866 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:45 compute-1 systemd-udevd[214399]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.867 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c343793d-a8a2-47f5-a1af-9121d4bf09d7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:45 compute-1 nova_compute[183403]: 2026-01-26 15:37:45.869 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:45 compute-1 NetworkManager[55716]: <info>  [1769441865.8798] device (tap8152de5c-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:37:45 compute-1 NetworkManager[55716]: <info>  [1769441865.8809] device (tap8152de5c-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:37:45 compute-1 systemd-machined[154697]: New machine qemu-21-instance-0000001c.
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.895 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[78c1700c-0f6a-47d1-9baa-432520f9019c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.898 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[dfba71b8-0cb3-4a06-a0a7-916e8bad0340]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:45 compute-1 systemd[1]: Started Virtual Machine qemu-21-instance-0000001c.
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.921 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ffd518-598b-4c77-8368-755629019625]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.938 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[39496e9a-952b-44e8-8878-195635e1ab1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc98e8b1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:f7:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554932, 'reachable_time': 37014, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214409, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.955 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6348fb32-4225-4404-ae7a-bea509ad14a6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapcc98e8b1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554945, 'tstamp': 554945}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214413, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapcc98e8b1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554948, 'tstamp': 554948}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214413, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.956 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc98e8b1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:45 compute-1 nova_compute[183403]: 2026-01-26 15:37:45.959 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:45 compute-1 nova_compute[183403]: 2026-01-26 15:37:45.961 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.961 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc98e8b1-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.961 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.962 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc98e8b1-80, col_values=(('external_ids', {'iface-id': '219edf33-3765-4d7c-87b5-4ab0ed1d6a8a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.962 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:37:45 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:37:45.964 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[fae8931d-ce83-4ed8-81da-96a35279950c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-cc98e8b1-8169-4a08-8b22-cd8a87c017a0\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID cc98e8b1-8169-4a08-8b22-cd8a87c017a0\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:37:47 compute-1 nova_compute[183403]: 2026-01-26 15:37:47.163 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:47 compute-1 podman[214440]: 2026-01-26 15:37:47.904274802 +0000 UTC m=+0.063004781 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Jan 26 15:37:47 compute-1 podman[214439]: 2026-01-26 15:37:47.989750237 +0000 UTC m=+0.156186724 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, tcib_managed=true)
Jan 26 15:37:49 compute-1 openstack_network_exporter[195610]: ERROR   15:37:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:37:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:37:49 compute-1 openstack_network_exporter[195610]: ERROR   15:37:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:37:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:37:49 compute-1 ovn_controller[95641]: 2026-01-26T15:37:49Z|00226|binding|INFO|Claiming lport 8152de5c-ac6c-45dd-8313-2c972cb67562 for this chassis.
Jan 26 15:37:49 compute-1 ovn_controller[95641]: 2026-01-26T15:37:49Z|00227|binding|INFO|8152de5c-ac6c-45dd-8313-2c972cb67562: Claiming fa:16:3e:44:9b:fd 10.100.0.9
Jan 26 15:37:49 compute-1 ovn_controller[95641]: 2026-01-26T15:37:49Z|00228|binding|INFO|Setting lport 8152de5c-ac6c-45dd-8313-2c972cb67562 up in Southbound
Jan 26 15:37:50 compute-1 nova_compute[183403]: 2026-01-26 15:37:50.753 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:50 compute-1 nova_compute[183403]: 2026-01-26 15:37:50.754 183407 INFO nova.compute.manager [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Post operation of migration started
Jan 26 15:37:50 compute-1 nova_compute[183403]: 2026-01-26 15:37:50.755 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:51 compute-1 nova_compute[183403]: 2026-01-26 15:37:51.392 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:51 compute-1 nova_compute[183403]: 2026-01-26 15:37:51.393 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:51 compute-1 nova_compute[183403]: 2026-01-26 15:37:51.462 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-ddc6e2e7-fe6a-4589-a4e8-00138e842f1d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:37:51 compute-1 nova_compute[183403]: 2026-01-26 15:37:51.463 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-ddc6e2e7-fe6a-4589-a4e8-00138e842f1d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:37:51 compute-1 nova_compute[183403]: 2026-01-26 15:37:51.464 183407 DEBUG nova.network.neutron [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:37:51 compute-1 nova_compute[183403]: 2026-01-26 15:37:51.972 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:52 compute-1 nova_compute[183403]: 2026-01-26 15:37:52.165 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:52 compute-1 nova_compute[183403]: 2026-01-26 15:37:52.360 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:52 compute-1 nova_compute[183403]: 2026-01-26 15:37:52.567 183407 DEBUG nova.network.neutron [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Updating instance_info_cache with network_info: [{"id": "8152de5c-ac6c-45dd-8313-2c972cb67562", "address": "fa:16:3e:44:9b:fd", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8152de5c-ac", "ovs_interfaceid": "8152de5c-ac6c-45dd-8313-2c972cb67562", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:37:53 compute-1 nova_compute[183403]: 2026-01-26 15:37:53.075 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-ddc6e2e7-fe6a-4589-a4e8-00138e842f1d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:37:53 compute-1 nova_compute[183403]: 2026-01-26 15:37:53.861 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:37:53 compute-1 nova_compute[183403]: 2026-01-26 15:37:53.862 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:37:53 compute-1 nova_compute[183403]: 2026-01-26 15:37:53.862 183407 DEBUG oslo_concurrency.lockutils [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:37:53 compute-1 nova_compute[183403]: 2026-01-26 15:37:53.867 183407 INFO nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:37:53 compute-1 virtqemud[183290]: Domain id=21 name='instance-0000001c' uuid=ddc6e2e7-fe6a-4589-a4e8-00138e842f1d is tainted: custom-monitor
Jan 26 15:37:54 compute-1 nova_compute[183403]: 2026-01-26 15:37:54.872 183407 INFO nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:37:55 compute-1 nova_compute[183403]: 2026-01-26 15:37:55.756 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:55 compute-1 nova_compute[183403]: 2026-01-26 15:37:55.881 183407 INFO nova.virt.libvirt.driver [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:37:55 compute-1 nova_compute[183403]: 2026-01-26 15:37:55.888 183407 DEBUG nova.compute.manager [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:37:56 compute-1 nova_compute[183403]: 2026-01-26 15:37:56.399 183407 DEBUG nova.objects.instance [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:37:57 compute-1 nova_compute[183403]: 2026-01-26 15:37:57.167 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:37:57 compute-1 nova_compute[183403]: 2026-01-26 15:37:57.445 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:58 compute-1 nova_compute[183403]: 2026-01-26 15:37:58.378 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:37:58 compute-1 nova_compute[183403]: 2026-01-26 15:37:58.379 183407 WARNING neutronclient.v2_0.client [None req-0ea1f65d-250e-4c62-bbf0-8331bf554056 a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:38:00 compute-1 nova_compute[183403]: 2026-01-26 15:38:00.759 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:02 compute-1 nova_compute[183403]: 2026-01-26 15:38:02.170 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:02 compute-1 nova_compute[183403]: 2026-01-26 15:38:02.845 183407 DEBUG oslo_concurrency.lockutils [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "5f9467db-9e52-4153-bd8b-23e40544aee2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:02 compute-1 nova_compute[183403]: 2026-01-26 15:38:02.846 183407 DEBUG oslo_concurrency.lockutils [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:02 compute-1 nova_compute[183403]: 2026-01-26 15:38:02.847 183407 DEBUG oslo_concurrency.lockutils [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "5f9467db-9e52-4153-bd8b-23e40544aee2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:02 compute-1 nova_compute[183403]: 2026-01-26 15:38:02.847 183407 DEBUG oslo_concurrency.lockutils [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:02 compute-1 nova_compute[183403]: 2026-01-26 15:38:02.848 183407 DEBUG oslo_concurrency.lockutils [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:02 compute-1 nova_compute[183403]: 2026-01-26 15:38:02.864 183407 INFO nova.compute.manager [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Terminating instance
Jan 26 15:38:03 compute-1 nova_compute[183403]: 2026-01-26 15:38:03.385 183407 DEBUG nova.compute.manager [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:38:03 compute-1 kernel: tapfc6d9d4f-93 (unregistering): left promiscuous mode
Jan 26 15:38:03 compute-1 NetworkManager[55716]: <info>  [1769441883.4466] device (tapfc6d9d4f-93): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:38:03 compute-1 ovn_controller[95641]: 2026-01-26T15:38:03Z|00229|binding|INFO|Releasing lport fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 from this chassis (sb_readonly=0)
Jan 26 15:38:03 compute-1 ovn_controller[95641]: 2026-01-26T15:38:03Z|00230|binding|INFO|Setting lport fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 down in Southbound
Jan 26 15:38:03 compute-1 nova_compute[183403]: 2026-01-26 15:38:03.462 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:03 compute-1 ovn_controller[95641]: 2026-01-26T15:38:03Z|00231|binding|INFO|Removing iface tapfc6d9d4f-93 ovn-installed in OVS
Jan 26 15:38:03 compute-1 nova_compute[183403]: 2026-01-26 15:38:03.465 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:03 compute-1 nova_compute[183403]: 2026-01-26 15:38:03.479 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:03 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 26 15:38:03 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001d.scope: Consumed 3.856s CPU time.
Jan 26 15:38:03 compute-1 systemd-machined[154697]: Machine qemu-20-instance-0000001d terminated.
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.556 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:18:02 10.100.0.13'], port_security=['fa:16:3e:83:18:02 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5f9467db-9e52-4153-bd8b-23e40544aee2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'c563489a-e307-4381-b22d-8f22c6dbbfd6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76661f47-b7c7-4131-9a1a-0f8828404115, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=fc6d9d4f-93a9-4b14-81b9-6d4d927635b5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.557 104930 INFO neutron.agent.ovn.metadata.agent [-] Port fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 in datapath cc98e8b1-8169-4a08-8b22-cd8a87c017a0 unbound from our chassis
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.559 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc98e8b1-8169-4a08-8b22-cd8a87c017a0
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.586 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[affa8c10-6563-4b9d-afce-b8672ed19f9c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.627 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[2e56c9ac-436f-4602-bae1-4d6a5c29d44e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.632 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[20f927d9-6a7d-4c67-965c-7413a12736f5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:03 compute-1 nova_compute[183403]: 2026-01-26 15:38:03.663 183407 INFO nova.virt.libvirt.driver [-] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Instance destroyed successfully.
Jan 26 15:38:03 compute-1 nova_compute[183403]: 2026-01-26 15:38:03.664 183407 DEBUG nova.objects.instance [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lazy-loading 'resources' on Instance uuid 5f9467db-9e52-4153-bd8b-23e40544aee2 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.670 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d3a3c4-4d2a-4cea-8fca-6f59896255cf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.685 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f739cb-c9eb-4e04-83de-baa24efa0b8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc98e8b1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:f7:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554932, 'reachable_time': 37014, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214513, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.699 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c18820d5-0785-44dc-b011-870a349c4301]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapcc98e8b1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554945, 'tstamp': 554945}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214514, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapcc98e8b1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554948, 'tstamp': 554948}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214514, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.701 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc98e8b1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:38:03 compute-1 nova_compute[183403]: 2026-01-26 15:38:03.702 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:03 compute-1 nova_compute[183403]: 2026-01-26 15:38:03.706 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.706 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc98e8b1-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.707 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.708 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc98e8b1-80, col_values=(('external_ids', {'iface-id': '219edf33-3765-4d7c-87b5-4ab0ed1d6a8a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.708 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:38:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:03.709 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[08e57213-615c-49d0-8901-eccda40333b5]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-cc98e8b1-8169-4a08-8b22-cd8a87c017a0\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID cc98e8b1-8169-4a08-8b22-cd8a87c017a0\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.184 183407 DEBUG nova.virt.libvirt.vif [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:36:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-117472076',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1174720',id=29,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:36:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f7ddd8ab2ae4841a1f43ae8078bb924',ramdisk_id='',reservation_id='r-lt5qszy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:37:16Z,user_data=None,user_id='8152e350b54f44cabafc751c752d6f92',uuid=5f9467db-9e52-4153-bd8b-23e40544aee2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "address": "fa:16:3e:83:18:02", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6d9d4f-93", "ovs_interfaceid": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.184 183407 DEBUG nova.network.os_vif_util [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Converting VIF {"id": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "address": "fa:16:3e:83:18:02", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc6d9d4f-93", "ovs_interfaceid": "fc6d9d4f-93a9-4b14-81b9-6d4d927635b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.185 183407 DEBUG nova.network.os_vif_util [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:18:02,bridge_name='br-int',has_traffic_filtering=True,id=fc6d9d4f-93a9-4b14-81b9-6d4d927635b5,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6d9d4f-93') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.186 183407 DEBUG os_vif [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:18:02,bridge_name='br-int',has_traffic_filtering=True,id=fc6d9d4f-93a9-4b14-81b9-6d4d927635b5,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6d9d4f-93') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.189 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.190 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6d9d4f-93, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.229 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.234 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.235 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.235 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b597b0ec-31e1-4960-ae31-26ca4c6f5403) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.237 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.238 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.246 183407 INFO os_vif [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:18:02,bridge_name='br-int',has_traffic_filtering=True,id=fc6d9d4f-93a9-4b14-81b9-6d4d927635b5,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc6d9d4f-93')
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.246 183407 INFO nova.virt.libvirt.driver [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Deleting instance files /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2_del
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.247 183407 INFO nova.virt.libvirt.driver [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Deletion of /var/lib/nova/instances/5f9467db-9e52-4153-bd8b-23e40544aee2_del complete
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.679 183407 DEBUG nova.compute.manager [req-53e5d7a4-cd6d-4acd-ac52-e4c7403bd9f7 req-9a96968c-67d9-44bd-84ab-40048a146172 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Received event network-vif-unplugged-fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.679 183407 DEBUG oslo_concurrency.lockutils [req-53e5d7a4-cd6d-4acd-ac52-e4c7403bd9f7 req-9a96968c-67d9-44bd-84ab-40048a146172 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "5f9467db-9e52-4153-bd8b-23e40544aee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.680 183407 DEBUG oslo_concurrency.lockutils [req-53e5d7a4-cd6d-4acd-ac52-e4c7403bd9f7 req-9a96968c-67d9-44bd-84ab-40048a146172 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.680 183407 DEBUG oslo_concurrency.lockutils [req-53e5d7a4-cd6d-4acd-ac52-e4c7403bd9f7 req-9a96968c-67d9-44bd-84ab-40048a146172 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.680 183407 DEBUG nova.compute.manager [req-53e5d7a4-cd6d-4acd-ac52-e4c7403bd9f7 req-9a96968c-67d9-44bd-84ab-40048a146172 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] No waiting events found dispatching network-vif-unplugged-fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.681 183407 DEBUG nova.compute.manager [req-53e5d7a4-cd6d-4acd-ac52-e4c7403bd9f7 req-9a96968c-67d9-44bd-84ab-40048a146172 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Received event network-vif-unplugged-fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.899 183407 INFO nova.compute.manager [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Took 1.51 seconds to destroy the instance on the hypervisor.
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.900 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.900 183407 DEBUG nova.compute.manager [-] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.901 183407 DEBUG nova.network.neutron [-] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:38:04 compute-1 nova_compute[183403]: 2026-01-26 15:38:04.901 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:38:05 compute-1 nova_compute[183403]: 2026-01-26 15:38:05.126 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:38:05 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:05.373 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:38:05 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:05.374 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:38:05 compute-1 nova_compute[183403]: 2026-01-26 15:38:05.374 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:05 compute-1 nova_compute[183403]: 2026-01-26 15:38:05.436 183407 DEBUG nova.compute.manager [req-6ee849f2-29f7-4cc5-a6f7-75d27ecdb48e req-1d3dd520-77f3-4948-b514-1cc591fb1476 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Received event network-vif-deleted-fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:38:05 compute-1 nova_compute[183403]: 2026-01-26 15:38:05.436 183407 INFO nova.compute.manager [req-6ee849f2-29f7-4cc5-a6f7-75d27ecdb48e req-1d3dd520-77f3-4948-b514-1cc591fb1476 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Neutron deleted interface fc6d9d4f-93a9-4b14-81b9-6d4d927635b5; detaching it from the instance and deleting it from the info cache
Jan 26 15:38:05 compute-1 nova_compute[183403]: 2026-01-26 15:38:05.436 183407 DEBUG nova.network.neutron [req-6ee849f2-29f7-4cc5-a6f7-75d27ecdb48e req-1d3dd520-77f3-4948-b514-1cc591fb1476 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:38:05 compute-1 podman[192725]: time="2026-01-26T15:38:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:38:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:38:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:38:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:38:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2654 "" "Go-http-client/1.1"
Jan 26 15:38:05 compute-1 nova_compute[183403]: 2026-01-26 15:38:05.881 183407 DEBUG nova.network.neutron [-] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:38:05 compute-1 nova_compute[183403]: 2026-01-26 15:38:05.943 183407 DEBUG nova.compute.manager [req-6ee849f2-29f7-4cc5-a6f7-75d27ecdb48e req-1d3dd520-77f3-4948-b514-1cc591fb1476 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Detach interface failed, port_id=fc6d9d4f-93a9-4b14-81b9-6d4d927635b5, reason: Instance 5f9467db-9e52-4153-bd8b-23e40544aee2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.388 183407 INFO nova.compute.manager [-] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Took 1.49 seconds to deallocate network for instance.
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.738 183407 DEBUG nova.compute.manager [req-b3cf5265-cab6-4df9-af8e-e556965679f1 req-16b442fd-9fd6-4005-8eb8-581b9f6c1a6a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Received event network-vif-unplugged-fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.738 183407 DEBUG oslo_concurrency.lockutils [req-b3cf5265-cab6-4df9-af8e-e556965679f1 req-16b442fd-9fd6-4005-8eb8-581b9f6c1a6a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "5f9467db-9e52-4153-bd8b-23e40544aee2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.739 183407 DEBUG oslo_concurrency.lockutils [req-b3cf5265-cab6-4df9-af8e-e556965679f1 req-16b442fd-9fd6-4005-8eb8-581b9f6c1a6a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.739 183407 DEBUG oslo_concurrency.lockutils [req-b3cf5265-cab6-4df9-af8e-e556965679f1 req-16b442fd-9fd6-4005-8eb8-581b9f6c1a6a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.739 183407 DEBUG nova.compute.manager [req-b3cf5265-cab6-4df9-af8e-e556965679f1 req-16b442fd-9fd6-4005-8eb8-581b9f6c1a6a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] No waiting events found dispatching network-vif-unplugged-fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.739 183407 WARNING nova.compute.manager [req-b3cf5265-cab6-4df9-af8e-e556965679f1 req-16b442fd-9fd6-4005-8eb8-581b9f6c1a6a 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 5f9467db-9e52-4153-bd8b-23e40544aee2] Received unexpected event network-vif-unplugged-fc6d9d4f-93a9-4b14-81b9-6d4d927635b5 for instance with vm_state deleted and task_state None.
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.931 183407 DEBUG oslo_concurrency.lockutils [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.932 183407 DEBUG oslo_concurrency.lockutils [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:06 compute-1 nova_compute[183403]: 2026-01-26 15:38:06.999 183407 DEBUG nova.compute.provider_tree [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:38:07 compute-1 nova_compute[183403]: 2026-01-26 15:38:07.172 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:07 compute-1 nova_compute[183403]: 2026-01-26 15:38:07.508 183407 DEBUG nova.scheduler.client.report [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:38:07 compute-1 podman[214517]: 2026-01-26 15:38:07.907410871 +0000 UTC m=+0.076659857 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:38:07 compute-1 podman[214518]: 2026-01-26 15:38:07.914169192 +0000 UTC m=+0.081266547 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Jan 26 15:38:08 compute-1 nova_compute[183403]: 2026-01-26 15:38:08.020 183407 DEBUG oslo_concurrency.lockutils [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:08 compute-1 nova_compute[183403]: 2026-01-26 15:38:08.048 183407 INFO nova.scheduler.client.report [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Deleted allocations for instance 5f9467db-9e52-4153-bd8b-23e40544aee2
Jan 26 15:38:09 compute-1 nova_compute[183403]: 2026-01-26 15:38:09.237 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:09 compute-1 nova_compute[183403]: 2026-01-26 15:38:09.272 183407 DEBUG oslo_concurrency.lockutils [None req-52fe5e41-9e87-476e-83d3-07a01a3a6589 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "5f9467db-9e52-4153-bd8b-23e40544aee2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.426s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.096 183407 DEBUG oslo_concurrency.lockutils [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.097 183407 DEBUG oslo_concurrency.lockutils [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.097 183407 DEBUG oslo_concurrency.lockutils [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.098 183407 DEBUG oslo_concurrency.lockutils [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.098 183407 DEBUG oslo_concurrency.lockutils [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.116 183407 INFO nova.compute.manager [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Terminating instance
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.637 183407 DEBUG nova.compute.manager [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:38:11 compute-1 kernel: tap8152de5c-ac (unregistering): left promiscuous mode
Jan 26 15:38:11 compute-1 NetworkManager[55716]: <info>  [1769441891.6605] device (tap8152de5c-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:38:11 compute-1 ovn_controller[95641]: 2026-01-26T15:38:11Z|00232|binding|INFO|Releasing lport 8152de5c-ac6c-45dd-8313-2c972cb67562 from this chassis (sb_readonly=0)
Jan 26 15:38:11 compute-1 ovn_controller[95641]: 2026-01-26T15:38:11Z|00233|binding|INFO|Setting lport 8152de5c-ac6c-45dd-8313-2c972cb67562 down in Southbound
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.667 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:11 compute-1 ovn_controller[95641]: 2026-01-26T15:38:11Z|00234|binding|INFO|Removing iface tap8152de5c-ac ovn-installed in OVS
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.669 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:11 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:11.675 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:9b:fd 10.100.0.9'], port_security=['fa:16:3e:44:9b:fd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ddc6e2e7-fe6a-4589-a4e8-00138e842f1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7ddd8ab2ae4841a1f43ae8078bb924', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'c563489a-e307-4381-b22d-8f22c6dbbfd6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76661f47-b7c7-4131-9a1a-0f8828404115, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=8152de5c-ac6c-45dd-8313-2c972cb67562) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:38:11 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:11.677 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 8152de5c-ac6c-45dd-8313-2c972cb67562 in datapath cc98e8b1-8169-4a08-8b22-cd8a87c017a0 unbound from our chassis
Jan 26 15:38:11 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:11.678 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc98e8b1-8169-4a08-8b22-cd8a87c017a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:38:11 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:11.679 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d44eda9c-cad5-4aeb-af48-8aa97dc5f525]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:11 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:11.680 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 namespace which is not needed anymore
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.697 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:11 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 26 15:38:11 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Consumed 2.441s CPU time.
Jan 26 15:38:11 compute-1 systemd-machined[154697]: Machine qemu-21-instance-0000001c terminated.
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.816 183407 DEBUG nova.compute.manager [req-1f274e27-e7ea-4181-8d66-cef93f616d33 req-6224cee9-0659-4996-bb3d-b9bc61930181 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Received event network-vif-unplugged-8152de5c-ac6c-45dd-8313-2c972cb67562 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.817 183407 DEBUG oslo_concurrency.lockutils [req-1f274e27-e7ea-4181-8d66-cef93f616d33 req-6224cee9-0659-4996-bb3d-b9bc61930181 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.818 183407 DEBUG oslo_concurrency.lockutils [req-1f274e27-e7ea-4181-8d66-cef93f616d33 req-6224cee9-0659-4996-bb3d-b9bc61930181 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.818 183407 DEBUG oslo_concurrency.lockutils [req-1f274e27-e7ea-4181-8d66-cef93f616d33 req-6224cee9-0659-4996-bb3d-b9bc61930181 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.819 183407 DEBUG nova.compute.manager [req-1f274e27-e7ea-4181-8d66-cef93f616d33 req-6224cee9-0659-4996-bb3d-b9bc61930181 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] No waiting events found dispatching network-vif-unplugged-8152de5c-ac6c-45dd-8313-2c972cb67562 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.819 183407 DEBUG nova.compute.manager [req-1f274e27-e7ea-4181-8d66-cef93f616d33 req-6224cee9-0659-4996-bb3d-b9bc61930181 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Received event network-vif-unplugged-8152de5c-ac6c-45dd-8313-2c972cb67562 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:38:11 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[214237]: [NOTICE]   (214249) : haproxy version is 3.0.5-8e879a5
Jan 26 15:38:11 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[214237]: [NOTICE]   (214249) : path to executable is /usr/sbin/haproxy
Jan 26 15:38:11 compute-1 podman[214587]: 2026-01-26 15:38:11.861530661 +0000 UTC m=+0.050796076 container kill 7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 15:38:11 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[214237]: [WARNING]  (214249) : Exiting Master process...
Jan 26 15:38:11 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[214237]: [ALERT]    (214249) : Current worker (214251) exited with code 143 (Terminated)
Jan 26 15:38:11 compute-1 neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0[214237]: [WARNING]  (214249) : All workers exited. Exiting... (0)
Jan 26 15:38:11 compute-1 systemd[1]: libpod-7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed.scope: Deactivated successfully.
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.916 183407 INFO nova.virt.libvirt.driver [-] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Instance destroyed successfully.
Jan 26 15:38:11 compute-1 nova_compute[183403]: 2026-01-26 15:38:11.918 183407 DEBUG nova.objects.instance [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lazy-loading 'resources' on Instance uuid ddc6e2e7-fe6a-4589-a4e8-00138e842f1d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:38:11 compute-1 podman[214608]: 2026-01-26 15:38:11.928474092 +0000 UTC m=+0.028467385 container died 7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.174 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:12 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed-userdata-shm.mount: Deactivated successfully.
Jan 26 15:38:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-806063821df49012ebc505ea3ce04c9947c8c17b3be388d1754d8012dd4d2102-merged.mount: Deactivated successfully.
Jan 26 15:38:12 compute-1 podman[214608]: 2026-01-26 15:38:12.377304443 +0000 UTC m=+0.477297746 container cleanup 7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:38:12 compute-1 systemd[1]: libpod-conmon-7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed.scope: Deactivated successfully.
Jan 26 15:38:12 compute-1 podman[214614]: 2026-01-26 15:38:12.401192039 +0000 UTC m=+0.494213025 container remove 7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.408 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2eea48d9-197f-4df4-86c3-97060e805fce]: (4, ("Mon Jan 26 03:38:11 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 (7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed)\n7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed\nMon Jan 26 03:38:11 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 (7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed)\n7e613202ca7ad6ea01b69c3aa74a8eea4e003054211e6bdff090cb72b7ec8bed\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.409 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1ff443a4-894c-4e17-a815-cc9ccb6a081f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.410 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc98e8b1-8169-4a08-8b22-cd8a87c017a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.410 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[194c7f60-c6a0-4cba-80e1-a5db1b912495]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.411 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc98e8b1-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.413 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:12 compute-1 kernel: tapcc98e8b1-80: left promiscuous mode
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.428 183407 DEBUG nova.virt.libvirt.vif [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:35:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1988940178',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1988940',id=28,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:36:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f7ddd8ab2ae4841a1f43ae8078bb924',ramdisk_id='',reservation_id='r-a8nl1jf1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1156482628-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:37:56Z,user_data=None,user_id='8152e350b54f44cabafc751c752d6f92',uuid=ddc6e2e7-fe6a-4589-a4e8-00138e842f1d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8152de5c-ac6c-45dd-8313-2c972cb67562", "address": "fa:16:3e:44:9b:fd", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8152de5c-ac", "ovs_interfaceid": "8152de5c-ac6c-45dd-8313-2c972cb67562", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.430 183407 DEBUG nova.network.os_vif_util [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Converting VIF {"id": "8152de5c-ac6c-45dd-8313-2c972cb67562", "address": "fa:16:3e:44:9b:fd", "network": {"id": "cc98e8b1-8169-4a08-8b22-cd8a87c017a0", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-1789164558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a18d09bfe0e7479c8a237dd032889317", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8152de5c-ac", "ovs_interfaceid": "8152de5c-ac6c-45dd-8313-2c972cb67562", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.433 183407 DEBUG nova.network.os_vif_util [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=8152de5c-ac6c-45dd-8313-2c972cb67562,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8152de5c-ac') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.434 183407 DEBUG os_vif [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=8152de5c-ac6c-45dd-8313-2c972cb67562,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8152de5c-ac') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.437 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.438 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8152de5c-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.442 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.444 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.447 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c324c400-eeb2-46a1-976b-6456e4f02752]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.448 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.448 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a496c152-c7f2-42b4-8fc8-4b8a34c04391) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.451 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.455 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.460 183407 INFO os_vif [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=8152de5c-ac6c-45dd-8313-2c972cb67562,network=Network(cc98e8b1-8169-4a08-8b22-cd8a87c017a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8152de5c-ac')
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.461 183407 INFO nova.virt.libvirt.driver [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Deleting instance files /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d_del
Jan 26 15:38:12 compute-1 nova_compute[183403]: 2026-01-26 15:38:12.461 183407 INFO nova.virt.libvirt.driver [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Deletion of /var/lib/nova/instances/ddc6e2e7-fe6a-4589-a4e8-00138e842f1d_del complete
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.471 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1b5a67ec-cbc1-4711-ab88-d7bcf57712a3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.472 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[4adb681c-6119-43ad-b9a4-2b4b8c337d20]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.497 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ada288-f7bd-416d-9bbf-581daca71106]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554924, 'reachable_time': 26071, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214652, 'error': None, 'target': 'ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:12 compute-1 systemd[1]: run-netns-ovnmeta\x2dcc98e8b1\x2d8169\x2d4a08\x2d8b22\x2dcd8a87c017a0.mount: Deactivated successfully.
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.504 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc98e8b1-8169-4a08-8b22-cd8a87c017a0 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:38:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:12.505 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[b97409e7-364d-4da1-8b2b-02c50bb8ac37]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.225 183407 INFO nova.compute.manager [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Took 1.59 seconds to destroy the instance on the hypervisor.
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.225 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.225 183407 DEBUG nova.compute.manager [-] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.225 183407 DEBUG nova.network.neutron [-] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.226 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.300 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.666 183407 DEBUG nova.compute.manager [req-269a0c69-5fa0-4881-be0f-9258c15442e3 req-edf856a0-e27f-4169-a6f4-f62c82409743 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Received event network-vif-deleted-8152de5c-ac6c-45dd-8313-2c972cb67562 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.666 183407 INFO nova.compute.manager [req-269a0c69-5fa0-4881-be0f-9258c15442e3 req-edf856a0-e27f-4169-a6f4-f62c82409743 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Neutron deleted interface 8152de5c-ac6c-45dd-8313-2c972cb67562; detaching it from the instance and deleting it from the info cache
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.666 183407 DEBUG nova.network.neutron [req-269a0c69-5fa0-4881-be0f-9258c15442e3 req-edf856a0-e27f-4169-a6f4-f62c82409743 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.860 183407 DEBUG nova.compute.manager [req-4c0c2c68-4aff-4f5a-b7be-17fecb5bddba req-351b271f-d940-4ed7-8695-c0ac6a7cb943 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Received event network-vif-unplugged-8152de5c-ac6c-45dd-8313-2c972cb67562 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.860 183407 DEBUG oslo_concurrency.lockutils [req-4c0c2c68-4aff-4f5a-b7be-17fecb5bddba req-351b271f-d940-4ed7-8695-c0ac6a7cb943 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.860 183407 DEBUG oslo_concurrency.lockutils [req-4c0c2c68-4aff-4f5a-b7be-17fecb5bddba req-351b271f-d940-4ed7-8695-c0ac6a7cb943 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.861 183407 DEBUG oslo_concurrency.lockutils [req-4c0c2c68-4aff-4f5a-b7be-17fecb5bddba req-351b271f-d940-4ed7-8695-c0ac6a7cb943 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.861 183407 DEBUG nova.compute.manager [req-4c0c2c68-4aff-4f5a-b7be-17fecb5bddba req-351b271f-d940-4ed7-8695-c0ac6a7cb943 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] No waiting events found dispatching network-vif-unplugged-8152de5c-ac6c-45dd-8313-2c972cb67562 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:38:13 compute-1 nova_compute[183403]: 2026-01-26 15:38:13.861 183407 DEBUG nova.compute.manager [req-4c0c2c68-4aff-4f5a-b7be-17fecb5bddba req-351b271f-d940-4ed7-8695-c0ac6a7cb943 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Received event network-vif-unplugged-8152de5c-ac6c-45dd-8313-2c972cb67562 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:38:14 compute-1 nova_compute[183403]: 2026-01-26 15:38:14.123 183407 DEBUG nova.network.neutron [-] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:38:14 compute-1 nova_compute[183403]: 2026-01-26 15:38:14.174 183407 DEBUG nova.compute.manager [req-269a0c69-5fa0-4881-be0f-9258c15442e3 req-edf856a0-e27f-4169-a6f4-f62c82409743 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Detach interface failed, port_id=8152de5c-ac6c-45dd-8313-2c972cb67562, reason: Instance ddc6e2e7-fe6a-4589-a4e8-00138e842f1d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:38:14 compute-1 nova_compute[183403]: 2026-01-26 15:38:14.630 183407 INFO nova.compute.manager [-] [instance: ddc6e2e7-fe6a-4589-a4e8-00138e842f1d] Took 1.40 seconds to deallocate network for instance.
Jan 26 15:38:15 compute-1 nova_compute[183403]: 2026-01-26 15:38:15.153 183407 DEBUG oslo_concurrency.lockutils [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:15 compute-1 nova_compute[183403]: 2026-01-26 15:38:15.154 183407 DEBUG oslo_concurrency.lockutils [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:15 compute-1 nova_compute[183403]: 2026-01-26 15:38:15.161 183407 DEBUG oslo_concurrency.lockutils [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:15 compute-1 nova_compute[183403]: 2026-01-26 15:38:15.200 183407 INFO nova.scheduler.client.report [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Deleted allocations for instance ddc6e2e7-fe6a-4589-a4e8-00138e842f1d
Jan 26 15:38:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:15.377 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:38:16 compute-1 nova_compute[183403]: 2026-01-26 15:38:16.233 183407 DEBUG oslo_concurrency.lockutils [None req-3fe3e2a3-7f52-4969-9006-fa47d09f475c 8152e350b54f44cabafc751c752d6f92 8f7ddd8ab2ae4841a1f43ae8078bb924 - - default default] Lock "ddc6e2e7-fe6a-4589-a4e8-00138e842f1d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.136s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:17 compute-1 nova_compute[183403]: 2026-01-26 15:38:17.176 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:17 compute-1 nova_compute[183403]: 2026-01-26 15:38:17.449 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:18 compute-1 podman[214654]: 2026-01-26 15:38:18.905900032 +0000 UTC m=+0.076276586 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 26 15:38:19 compute-1 podman[214653]: 2026-01-26 15:38:19.025067909 +0000 UTC m=+0.197367328 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 26 15:38:19 compute-1 openstack_network_exporter[195610]: ERROR   15:38:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:38:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:38:19 compute-1 openstack_network_exporter[195610]: ERROR   15:38:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:38:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:38:21 compute-1 nova_compute[183403]: 2026-01-26 15:38:21.800 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:22 compute-1 nova_compute[183403]: 2026-01-26 15:38:22.178 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:22 compute-1 nova_compute[183403]: 2026-01-26 15:38:22.452 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:23 compute-1 nova_compute[183403]: 2026-01-26 15:38:23.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:38:27 compute-1 nova_compute[183403]: 2026-01-26 15:38:27.181 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:27 compute-1 nova_compute[183403]: 2026-01-26 15:38:27.454 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:29.096 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:29.097 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:29.097 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:29 compute-1 nova_compute[183403]: 2026-01-26 15:38:29.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:38:30 compute-1 nova_compute[183403]: 2026-01-26 15:38:30.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.172 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.173 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.173 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.173 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.342 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.344 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.369 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.370 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5843MB free_disk=73.14490509033203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.370 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:38:31 compute-1 nova_compute[183403]: 2026-01-26 15:38:31.370 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:38:32 compute-1 nova_compute[183403]: 2026-01-26 15:38:32.183 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:32 compute-1 nova_compute[183403]: 2026-01-26 15:38:32.456 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:32 compute-1 nova_compute[183403]: 2026-01-26 15:38:32.484 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:38:32 compute-1 nova_compute[183403]: 2026-01-26 15:38:32.484 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:38:31 up  1:33,  0 user,  load average: 0.14, 0.13, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:38:32 compute-1 nova_compute[183403]: 2026-01-26 15:38:32.510 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:38:33 compute-1 nova_compute[183403]: 2026-01-26 15:38:33.018 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:38:33 compute-1 nova_compute[183403]: 2026-01-26 15:38:33.527 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:38:33 compute-1 nova_compute[183403]: 2026-01-26 15:38:33.527 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.157s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:38:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:33.538 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:11:27 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '312beda09adb420b9f44a490d7257008', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2b7e03b-5278-4177-91b7-862e57a7c9ad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=70e6ec0a-21db-4d83-b0cb-0624424ede18) old=Port_Binding(mac=['fa:16:3e:38:11:27'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '312beda09adb420b9f44a490d7257008', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:38:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:33.539 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 70e6ec0a-21db-4d83-b0cb-0624424ede18 in datapath d9d38847-a43b-4d1e-a0b1-c3f77a879374 updated
Jan 26 15:38:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:33.540 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9d38847-a43b-4d1e-a0b1-c3f77a879374, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:38:33 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:33.541 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[30786198-71ea-4dff-aa34-b77439438e39]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:35 compute-1 nova_compute[183403]: 2026-01-26 15:38:35.527 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:38:35 compute-1 nova_compute[183403]: 2026-01-26 15:38:35.528 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:38:35 compute-1 nova_compute[183403]: 2026-01-26 15:38:35.528 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:38:35 compute-1 nova_compute[183403]: 2026-01-26 15:38:35.528 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:38:35 compute-1 nova_compute[183403]: 2026-01-26 15:38:35.529 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:38:35 compute-1 nova_compute[183403]: 2026-01-26 15:38:35.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:38:35 compute-1 podman[192725]: time="2026-01-26T15:38:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:38:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:38:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:38:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:38:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:38:37 compute-1 nova_compute[183403]: 2026-01-26 15:38:37.185 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:37 compute-1 nova_compute[183403]: 2026-01-26 15:38:37.457 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:38 compute-1 podman[214699]: 2026-01-26 15:38:38.938706978 +0000 UTC m=+0.110552994 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:38:38 compute-1 podman[214700]: 2026-01-26 15:38:38.959058563 +0000 UTC m=+0.118980533 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Jan 26 15:38:42 compute-1 nova_compute[183403]: 2026-01-26 15:38:42.188 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:42 compute-1 nova_compute[183403]: 2026-01-26 15:38:42.459 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:43 compute-1 nova_compute[183403]: 2026-01-26 15:38:43.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:38:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:44.240 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:e6:b5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f2a3bacc-dfea-4ae8-b666-1477cd1f0fa2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2a3bacc-dfea-4ae8-b666-1477cd1f0fa2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=067503a0-46cd-4882-8ed4-1827eeee02c6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=338754cd-1e3d-4b8e-9c26-75fd1166b266) old=Port_Binding(mac=['fa:16:3e:dc:e6:b5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f2a3bacc-dfea-4ae8-b666-1477cd1f0fa2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2a3bacc-dfea-4ae8-b666-1477cd1f0fa2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:38:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:44.241 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 338754cd-1e3d-4b8e-9c26-75fd1166b266 in datapath f2a3bacc-dfea-4ae8-b666-1477cd1f0fa2 updated
Jan 26 15:38:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:44.241 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2a3bacc-dfea-4ae8-b666-1477cd1f0fa2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:38:44 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:38:44.243 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ef4595-e2ad-422a-bfd0-8adeb4865720]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:38:47 compute-1 nova_compute[183403]: 2026-01-26 15:38:47.190 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:47 compute-1 nova_compute[183403]: 2026-01-26 15:38:47.461 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:49 compute-1 openstack_network_exporter[195610]: ERROR   15:38:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:38:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:38:49 compute-1 openstack_network_exporter[195610]: ERROR   15:38:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:38:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:38:49 compute-1 podman[214746]: 2026-01-26 15:38:49.911288327 +0000 UTC m=+0.069128614 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=ovn_metadata_agent)
Jan 26 15:38:49 compute-1 podman[214745]: 2026-01-26 15:38:49.947816949 +0000 UTC m=+0.116892964 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:38:52 compute-1 nova_compute[183403]: 2026-01-26 15:38:52.191 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:52 compute-1 nova_compute[183403]: 2026-01-26 15:38:52.462 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:54 compute-1 ovn_controller[95641]: 2026-01-26T15:38:54Z|00235|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 15:38:57 compute-1 nova_compute[183403]: 2026-01-26 15:38:57.193 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:38:57 compute-1 nova_compute[183403]: 2026-01-26 15:38:57.463 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:02 compute-1 nova_compute[183403]: 2026-01-26 15:39:02.197 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:02 compute-1 nova_compute[183403]: 2026-01-26 15:39:02.465 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:05 compute-1 podman[192725]: time="2026-01-26T15:39:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:39:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:39:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:39:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:39:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2189 "" "Go-http-client/1.1"
Jan 26 15:39:07 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:07.136 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:39:07 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:07.137 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:39:07 compute-1 nova_compute[183403]: 2026-01-26 15:39:07.138 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:07 compute-1 nova_compute[183403]: 2026-01-26 15:39:07.198 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:07 compute-1 nova_compute[183403]: 2026-01-26 15:39:07.467 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:09 compute-1 podman[214788]: 2026-01-26 15:39:09.881741802 +0000 UTC m=+0.063229458 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:39:09 compute-1 podman[214789]: 2026-01-26 15:39:09.886620299 +0000 UTC m=+0.062742454 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter)
Jan 26 15:39:12 compute-1 nova_compute[183403]: 2026-01-26 15:39:12.199 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:12 compute-1 nova_compute[183403]: 2026-01-26 15:39:12.469 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:14 compute-1 nova_compute[183403]: 2026-01-26 15:39:14.789 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:14 compute-1 nova_compute[183403]: 2026-01-26 15:39:14.790 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:15.138 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:15 compute-1 nova_compute[183403]: 2026-01-26 15:39:15.295 183407 DEBUG nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:39:16 compute-1 nova_compute[183403]: 2026-01-26 15:39:16.197 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:16 compute-1 nova_compute[183403]: 2026-01-26 15:39:16.197 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:16 compute-1 nova_compute[183403]: 2026-01-26 15:39:16.204 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:39:16 compute-1 nova_compute[183403]: 2026-01-26 15:39:16.205 183407 INFO nova.compute.claims [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:39:17 compute-1 nova_compute[183403]: 2026-01-26 15:39:17.201 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:17 compute-1 nova_compute[183403]: 2026-01-26 15:39:17.265 183407 DEBUG nova.compute.provider_tree [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:39:17 compute-1 nova_compute[183403]: 2026-01-26 15:39:17.517 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:17 compute-1 nova_compute[183403]: 2026-01-26 15:39:17.777 183407 DEBUG nova.scheduler.client.report [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:39:18 compute-1 nova_compute[183403]: 2026-01-26 15:39:18.287 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:18 compute-1 nova_compute[183403]: 2026-01-26 15:39:18.289 183407 DEBUG nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:39:18 compute-1 nova_compute[183403]: 2026-01-26 15:39:18.850 183407 DEBUG nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:39:18 compute-1 nova_compute[183403]: 2026-01-26 15:39:18.851 183407 DEBUG nova.network.neutron [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:39:18 compute-1 nova_compute[183403]: 2026-01-26 15:39:18.851 183407 WARNING neutronclient.v2_0.client [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:18 compute-1 nova_compute[183403]: 2026-01-26 15:39:18.852 183407 WARNING neutronclient.v2_0.client [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:19 compute-1 nova_compute[183403]: 2026-01-26 15:39:19.385 183407 INFO nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:39:19 compute-1 openstack_network_exporter[195610]: ERROR   15:39:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:39:19 compute-1 openstack_network_exporter[195610]: ERROR   15:39:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:39:19 compute-1 nova_compute[183403]: 2026-01-26 15:39:19.903 183407 DEBUG nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:39:19 compute-1 nova_compute[183403]: 2026-01-26 15:39:19.974 183407 DEBUG nova.network.neutron [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Successfully created port: b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:39:20 compute-1 podman[214830]: 2026-01-26 15:39:20.899310711 +0000 UTC m=+0.083398177 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 15:39:20 compute-1 podman[214831]: 2026-01-26 15:39:20.899396284 +0000 UTC m=+0.080855636 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 15:39:20 compute-1 nova_compute[183403]: 2026-01-26 15:39:20.921 183407 DEBUG nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:39:20 compute-1 nova_compute[183403]: 2026-01-26 15:39:20.923 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:39:20 compute-1 nova_compute[183403]: 2026-01-26 15:39:20.924 183407 INFO nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Creating image(s)
Jan 26 15:39:20 compute-1 nova_compute[183403]: 2026-01-26 15:39:20.924 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "/var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:20 compute-1 nova_compute[183403]: 2026-01-26 15:39:20.925 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "/var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:20 compute-1 nova_compute[183403]: 2026-01-26 15:39:20.927 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "/var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:20 compute-1 nova_compute[183403]: 2026-01-26 15:39:20.928 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:39:20 compute-1 nova_compute[183403]: 2026-01-26 15:39:20.930 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:39:20 compute-1 nova_compute[183403]: 2026-01-26 15:39:20.932 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.013 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.014 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.014 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.015 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.018 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.018 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.074 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.075 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.114 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.115 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.116 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.161 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.162 183407 DEBUG nova.virt.disk.api [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Checking if we can resize image /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.163 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.241 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.242 183407 DEBUG nova.virt.disk.api [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Cannot resize image /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.242 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.242 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Ensure instance console log exists: /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.243 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.243 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.243 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.481 183407 DEBUG nova.network.neutron [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Successfully updated port: b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.543 183407 DEBUG nova.compute.manager [req-478bab36-965d-4464-891c-c98e0609e605 req-371be777-6ce4-4047-95f3-ee2972461458 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Received event network-changed-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.544 183407 DEBUG nova.compute.manager [req-478bab36-965d-4464-891c-c98e0609e605 req-371be777-6ce4-4047-95f3-ee2972461458 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Refreshing instance network info cache due to event network-changed-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.544 183407 DEBUG oslo_concurrency.lockutils [req-478bab36-965d-4464-891c-c98e0609e605 req-371be777-6ce4-4047-95f3-ee2972461458 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.544 183407 DEBUG oslo_concurrency.lockutils [req-478bab36-965d-4464-891c-c98e0609e605 req-371be777-6ce4-4047-95f3-ee2972461458 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.545 183407 DEBUG nova.network.neutron [req-478bab36-965d-4464-891c-c98e0609e605 req-371be777-6ce4-4047-95f3-ee2972461458 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Refreshing network info cache for port b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:39:21 compute-1 nova_compute[183403]: 2026-01-26 15:39:21.987 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "refresh_cache-b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:39:22 compute-1 nova_compute[183403]: 2026-01-26 15:39:22.050 183407 WARNING neutronclient.v2_0.client [req-478bab36-965d-4464-891c-c98e0609e605 req-371be777-6ce4-4047-95f3-ee2972461458 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:22 compute-1 nova_compute[183403]: 2026-01-26 15:39:22.144 183407 DEBUG nova.network.neutron [req-478bab36-965d-4464-891c-c98e0609e605 req-371be777-6ce4-4047-95f3-ee2972461458 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:39:22 compute-1 nova_compute[183403]: 2026-01-26 15:39:22.204 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:22 compute-1 nova_compute[183403]: 2026-01-26 15:39:22.519 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:22 compute-1 nova_compute[183403]: 2026-01-26 15:39:22.521 183407 DEBUG nova.network.neutron [req-478bab36-965d-4464-891c-c98e0609e605 req-371be777-6ce4-4047-95f3-ee2972461458 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:39:23 compute-1 nova_compute[183403]: 2026-01-26 15:39:23.216 183407 DEBUG oslo_concurrency.lockutils [req-478bab36-965d-4464-891c-c98e0609e605 req-371be777-6ce4-4047-95f3-ee2972461458 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:39:23 compute-1 nova_compute[183403]: 2026-01-26 15:39:23.217 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquired lock "refresh_cache-b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:39:23 compute-1 nova_compute[183403]: 2026-01-26 15:39:23.218 183407 DEBUG nova.network.neutron [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:39:24 compute-1 nova_compute[183403]: 2026-01-26 15:39:24.490 183407 DEBUG nova.network.neutron [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:39:24 compute-1 nova_compute[183403]: 2026-01-26 15:39:24.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:39:25 compute-1 nova_compute[183403]: 2026-01-26 15:39:25.492 183407 WARNING neutronclient.v2_0.client [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:26 compute-1 nova_compute[183403]: 2026-01-26 15:39:26.551 183407 DEBUG nova.network.neutron [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Updating instance_info_cache with network_info: [{"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.057 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Releasing lock "refresh_cache-b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.058 183407 DEBUG nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Instance network_info: |[{"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.060 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Start _get_guest_xml network_info=[{"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.065 183407 WARNING nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.066 183407 DEBUG nova.virt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-810720634', uuid='b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab'), owner=OwnerMeta(userid='136d3cfdd6cb48e2ab65221bcc05d26c', username='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin', projectid='af8eea38f1d74ad1a01087c020ea8d02', projectname='tempest-TestExecuteZoneMigrationStrategy-1233966703'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": 
"b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769441967.0666788) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.070 183407 DEBUG nova.virt.libvirt.host [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.071 183407 DEBUG nova.virt.libvirt.host [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.073 183407 DEBUG nova.virt.libvirt.host [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.074 183407 DEBUG nova.virt.libvirt.host [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.074 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.075 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.075 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.075 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.075 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.076 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.076 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.076 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.076 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.077 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.077 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.077 183407 DEBUG nova.virt.hardware [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.080 183407 DEBUG nova.virt.libvirt.vif [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:39:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-810720634',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-810720634',id=31,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-hcbmag46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest-TestExecu
teZoneMigrationStrategy-1233966703-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:39:19Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.080 183407 DEBUG nova.network.os_vif_util [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converting VIF {"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.081 183407 DEBUG nova.network.os_vif_util [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:81:1e,bridge_name='br-int',has_traffic_filtering=True,id=b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0d5fa76-2c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.081 183407 DEBUG nova.objects.instance [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lazy-loading 'pci_devices' on Instance uuid b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.206 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.521 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.587 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <uuid>b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab</uuid>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <name>instance-0000001f</name>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-810720634</nova:name>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:39:27</nova:creationTime>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:39:27 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:39:27 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:user uuid="136d3cfdd6cb48e2ab65221bcc05d26c">tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin</nova:user>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:project uuid="af8eea38f1d74ad1a01087c020ea8d02">tempest-TestExecuteZoneMigrationStrategy-1233966703</nova:project>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         <nova:port uuid="b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4">
Jan 26 15:39:27 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <system>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <entry name="serial">b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab</entry>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <entry name="uuid">b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab</entry>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     </system>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <os>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   </os>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <features>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   </features>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk.config"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:9e:81:1e"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <target dev="tapb0d5fa76-2c"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/console.log" append="off"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <video>
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     </video>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:39:27 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:39:27 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:39:27 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:39:27 compute-1 nova_compute[183403]: </domain>
Jan 26 15:39:27 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.588 183407 DEBUG nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Preparing to wait for external event network-vif-plugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.589 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.589 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.589 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.590 183407 DEBUG nova.virt.libvirt.vif [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:39:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-810720634',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-810720634',id=31,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-hcbmag46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest
-TestExecuteZoneMigrationStrategy-1233966703-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:39:19Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.590 183407 DEBUG nova.network.os_vif_util [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converting VIF {"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.591 183407 DEBUG nova.network.os_vif_util [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:81:1e,bridge_name='br-int',has_traffic_filtering=True,id=b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0d5fa76-2c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.592 183407 DEBUG os_vif [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:81:1e,bridge_name='br-int',has_traffic_filtering=True,id=b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0d5fa76-2c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.592 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.592 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.593 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.593 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.594 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1a48a4c1-afa9-5515-9936-2d6d587904e9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.595 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.596 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.598 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.598 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb0d5fa76-2c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.598 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb0d5fa76-2c, col_values=(('qos', UUID('4f12d460-57e5-4c96-9a1b-880c43edee4d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.598 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb0d5fa76-2c, col_values=(('external_ids', {'iface-id': 'b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:81:1e', 'vm-uuid': 'b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.599 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:27 compute-1 NetworkManager[55716]: <info>  [1769441967.6006] manager: (tapb0d5fa76-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.601 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.605 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:27 compute-1 nova_compute[183403]: 2026-01-26 15:39:27.606 183407 INFO os_vif [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:81:1e,bridge_name='br-int',has_traffic_filtering=True,id=b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0d5fa76-2c')
Jan 26 15:39:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:29.098 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:29.098 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:29.098 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:29 compute-1 nova_compute[183403]: 2026-01-26 15:39:29.159 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:39:29 compute-1 nova_compute[183403]: 2026-01-26 15:39:29.160 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:39:29 compute-1 nova_compute[183403]: 2026-01-26 15:39:29.160 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] No VIF found with MAC fa:16:3e:9e:81:1e, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:39:29 compute-1 nova_compute[183403]: 2026-01-26 15:39:29.162 183407 INFO nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Using config drive
Jan 26 15:39:29 compute-1 nova_compute[183403]: 2026-01-26 15:39:29.678 183407 WARNING neutronclient.v2_0.client [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:30 compute-1 nova_compute[183403]: 2026-01-26 15:39:30.572 183407 INFO nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Creating config drive at /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk.config
Jan 26 15:39:30 compute-1 nova_compute[183403]: 2026-01-26 15:39:30.579 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbn88p70e execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:30 compute-1 nova_compute[183403]: 2026-01-26 15:39:30.586 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:39:30 compute-1 nova_compute[183403]: 2026-01-26 15:39:30.724 183407 DEBUG oslo_concurrency.processutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbn88p70e" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:31 compute-1 kernel: tapb0d5fa76-2c: entered promiscuous mode
Jan 26 15:39:30 compute-1 NetworkManager[55716]: <info>  [1769441970.8122] manager: (tapb0d5fa76-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Jan 26 15:39:31 compute-1 nova_compute[183403]: 2026-01-26 15:39:31.657 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:31 compute-1 nova_compute[183403]: 2026-01-26 15:39:31.658 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:31 compute-1 nova_compute[183403]: 2026-01-26 15:39:31.658 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:31 compute-1 nova_compute[183403]: 2026-01-26 15:39:31.659 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:39:31 compute-1 nova_compute[183403]: 2026-01-26 15:39:31.659 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:31 compute-1 ovn_controller[95641]: 2026-01-26T15:39:31Z|00236|binding|INFO|Claiming lport b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 for this chassis.
Jan 26 15:39:31 compute-1 ovn_controller[95641]: 2026-01-26T15:39:31Z|00237|binding|INFO|b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4: Claiming fa:16:3e:9e:81:1e 10.100.0.6
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.677 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:81:1e 10.100.0.6'], port_security=['fa:16:3e:9e:81:1e 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '4', 'neutron:security_group_ids': '41d84080-c1ba-42a6-8417-b57b30232ea3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2b7e03b-5278-4177-91b7-862e57a7c9ad, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.678 104930 INFO neutron.agent.ovn.metadata.agent [-] Port b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 in datapath d9d38847-a43b-4d1e-a0b1-c3f77a879374 bound to our chassis
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.679 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.693 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d923cf7f-e5d2-4d94-b341-19faa7a6c198]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.694 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd9d38847-a1 in ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:39:31 compute-1 systemd-udevd[214910]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.697 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd9d38847-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.697 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bc386794-b830-483b-83ed-55203af85d43]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.698 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[06ef53f5-814b-45e6-92bd-f1081805f118]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 NetworkManager[55716]: <info>  [1769441971.7119] device (tapb0d5fa76-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:39:31 compute-1 NetworkManager[55716]: <info>  [1769441971.7125] device (tapb0d5fa76-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.717 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba39b28-6619-471c-8299-2a1f7dcc72dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 systemd-machined[154697]: New machine qemu-22-instance-0000001f.
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.747 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3e027691-b08d-40ad-b494-cf66211a4bd2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 nova_compute[183403]: 2026-01-26 15:39:31.750 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:31 compute-1 systemd[1]: Started Virtual Machine qemu-22-instance-0000001f.
Jan 26 15:39:31 compute-1 ovn_controller[95641]: 2026-01-26T15:39:31Z|00238|binding|INFO|Setting lport b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 ovn-installed in OVS
Jan 26 15:39:31 compute-1 ovn_controller[95641]: 2026-01-26T15:39:31Z|00239|binding|INFO|Setting lport b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 up in Southbound
Jan 26 15:39:31 compute-1 nova_compute[183403]: 2026-01-26 15:39:31.755 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.782 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[accd3764-8583-45d7-a6dc-7abfe62a774c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 NetworkManager[55716]: <info>  [1769441971.7907] manager: (tapd9d38847-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.789 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c864eb-61ae-4bbd-aa42-dc1159f0af2e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.830 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[79095d75-a3db-4182-a508-9a51d3276ac3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.834 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[47c72c18-3e34-4f87-9776-11c021add594]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 NetworkManager[55716]: <info>  [1769441971.8638] device (tapd9d38847-a0): carrier: link connected
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.873 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[a2008739-1d09-4d35-bff4-8b4dcc0f9112]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.892 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d14d61fc-cddd-479e-8b7a-34f914823f0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9d38847-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:11:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569446, 'reachable_time': 31143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214945, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.907 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[7babd689-6747-4b98-a814-4d73e38dc408]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:1127'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569446, 'tstamp': 569446}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214946, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.928 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[fa546d35-195f-4167-ace9-45d1b1e46d19]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9d38847-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:11:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569446, 'reachable_time': 31143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214947, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:31.979 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[82c1c9b4-923a-4598-b8a3-f9a03d7292b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.131 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a4befad0-2594-4572-a65e-9dc427979524]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.133 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9d38847-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.133 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.134 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9d38847-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.136 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:32 compute-1 kernel: tapd9d38847-a0: entered promiscuous mode
Jan 26 15:39:32 compute-1 NetworkManager[55716]: <info>  [1769441972.1380] manager: (tapd9d38847-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.141 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.142 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9d38847-a0, col_values=(('external_ids', {'iface-id': '70e6ec0a-21db-4d83-b0cb-0624424ede18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.144 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:32 compute-1 ovn_controller[95641]: 2026-01-26T15:39:32Z|00240|binding|INFO|Releasing lport 70e6ec0a-21db-4d83-b0cb-0624424ede18 from this chassis (sb_readonly=0)
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.172 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.175 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7f305e-85c9-4253-a2a1-8af1a5caabfc]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.175 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.176 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.176 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d9d38847-a43b-4d1e-a0b1-c3f77a879374 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.176 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.176 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8798f8db-1d44-498e-ac51-c76d319705e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.177 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.178 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[72cc0578-7053-4eb6-bbd1-02fe318d540d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.178 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:39:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:39:32.179 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'env', 'PROCESS_TAG=haproxy-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d9d38847-a43b-4d1e-a0b1-c3f77a879374.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.207 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.574 183407 DEBUG nova.compute.manager [req-7c11d431-df7a-4cf0-ba8a-aeaac992800a req-33771b8a-8b69-494b-83dd-f6a05f91779d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Received event network-vif-plugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.575 183407 DEBUG oslo_concurrency.lockutils [req-7c11d431-df7a-4cf0-ba8a-aeaac992800a req-33771b8a-8b69-494b-83dd-f6a05f91779d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.575 183407 DEBUG oslo_concurrency.lockutils [req-7c11d431-df7a-4cf0-ba8a-aeaac992800a req-33771b8a-8b69-494b-83dd-f6a05f91779d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.576 183407 DEBUG oslo_concurrency.lockutils [req-7c11d431-df7a-4cf0-ba8a-aeaac992800a req-33771b8a-8b69-494b-83dd-f6a05f91779d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.576 183407 DEBUG nova.compute.manager [req-7c11d431-df7a-4cf0-ba8a-aeaac992800a req-33771b8a-8b69-494b-83dd-f6a05f91779d 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Processing event network-vif-plugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.577 183407 DEBUG nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.580 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.582 183407 INFO nova.virt.libvirt.driver [-] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Instance spawned successfully.
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.582 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.600 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:32 compute-1 podman[214988]: 2026-01-26 15:39:32.546288374 +0000 UTC m=+0.021688623 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.872 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.949 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:32 compute-1 nova_compute[183403]: 2026-01-26 15:39:32.950 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.013 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.095 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.096 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.097 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.097 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.098 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.098 183407 DEBUG nova.virt.libvirt.driver [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.211 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.212 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.254 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.255 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5812MB free_disk=73.14419174194336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.255 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.256 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:33 compute-1 podman[214988]: 2026-01-26 15:39:33.353698807 +0000 UTC m=+0.829099036 container create 286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, io.buildah.version=1.41.4)
Jan 26 15:39:33 compute-1 systemd[1]: Started libpod-conmon-286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3.scope.
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.610 183407 INFO nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Took 12.69 seconds to spawn the instance on the hypervisor.
Jan 26 15:39:33 compute-1 nova_compute[183403]: 2026-01-26 15:39:33.610 183407 DEBUG nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:39:33 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:39:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff6f30108b7d1787f5ecdcf0b399b7b3b1609a681826d9faeb46638ec70da0a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:39:33 compute-1 podman[214988]: 2026-01-26 15:39:33.980797325 +0000 UTC m=+1.456197574 container init 286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 15:39:33 compute-1 podman[214988]: 2026-01-26 15:39:33.987472663 +0000 UTC m=+1.462872892 container start 286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Jan 26 15:39:34 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215011]: [NOTICE]   (215015) : New worker (215017) forked
Jan 26 15:39:34 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215011]: [NOTICE]   (215015) : Loading success.
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.323 183407 INFO nova.compute.manager [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Took 18.52 seconds to build instance.
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.345 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.346 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.347 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:39:33 up  1:34,  0 user,  load average: 0.05, 0.11, 0.17\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_af8eea38f1d74ad1a01087c020ea8d02': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.385 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.629 183407 DEBUG nova.compute.manager [req-d8a39663-3a98-433d-80a7-aad90d1ca7a5 req-17cfa090-28e1-4540-a7f1-63c7e5ee36fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Received event network-vif-plugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.630 183407 DEBUG oslo_concurrency.lockutils [req-d8a39663-3a98-433d-80a7-aad90d1ca7a5 req-17cfa090-28e1-4540-a7f1-63c7e5ee36fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.630 183407 DEBUG oslo_concurrency.lockutils [req-d8a39663-3a98-433d-80a7-aad90d1ca7a5 req-17cfa090-28e1-4540-a7f1-63c7e5ee36fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.631 183407 DEBUG oslo_concurrency.lockutils [req-d8a39663-3a98-433d-80a7-aad90d1ca7a5 req-17cfa090-28e1-4540-a7f1-63c7e5ee36fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.631 183407 DEBUG nova.compute.manager [req-d8a39663-3a98-433d-80a7-aad90d1ca7a5 req-17cfa090-28e1-4540-a7f1-63c7e5ee36fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] No waiting events found dispatching network-vif-plugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.631 183407 WARNING nova.compute.manager [req-d8a39663-3a98-433d-80a7-aad90d1ca7a5 req-17cfa090-28e1-4540-a7f1-63c7e5ee36fa 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Received unexpected event network-vif-plugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 for instance with vm_state active and task_state None.
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.830 183407 DEBUG oslo_concurrency.lockutils [None req-1bbddfb6-d2bb-46e8-a79b-9bf28aa122a9 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.040s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:34 compute-1 nova_compute[183403]: 2026-01-26 15:39:34.895 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:39:35 compute-1 nova_compute[183403]: 2026-01-26 15:39:35.405 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:39:35 compute-1 nova_compute[183403]: 2026-01-26 15:39:35.406 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:35 compute-1 podman[192725]: time="2026-01-26T15:39:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:39:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:39:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:39:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:39:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2652 "" "Go-http-client/1.1"
Jan 26 15:39:36 compute-1 nova_compute[183403]: 2026-01-26 15:39:36.397 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:39:36 compute-1 nova_compute[183403]: 2026-01-26 15:39:36.399 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:39:36 compute-1 nova_compute[183403]: 2026-01-26 15:39:36.399 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:39:36 compute-1 nova_compute[183403]: 2026-01-26 15:39:36.399 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:39:36 compute-1 nova_compute[183403]: 2026-01-26 15:39:36.399 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:39:36 compute-1 nova_compute[183403]: 2026-01-26 15:39:36.400 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:39:36 compute-1 nova_compute[183403]: 2026-01-26 15:39:36.400 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:39:37 compute-1 nova_compute[183403]: 2026-01-26 15:39:37.213 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:37 compute-1 nova_compute[183403]: 2026-01-26 15:39:37.603 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:40 compute-1 podman[215027]: 2026-01-26 15:39:40.907070939 +0000 UTC m=+0.071168461 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container)
Jan 26 15:39:40 compute-1 podman[215026]: 2026-01-26 15:39:40.928756782 +0000 UTC m=+0.093721919 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:39:42 compute-1 nova_compute[183403]: 2026-01-26 15:39:42.214 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:42 compute-1 nova_compute[183403]: 2026-01-26 15:39:42.672 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:47 compute-1 nova_compute[183403]: 2026-01-26 15:39:47.217 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:47 compute-1 nova_compute[183403]: 2026-01-26 15:39:47.674 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:48 compute-1 ovn_controller[95641]: 2026-01-26T15:39:48Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9e:81:1e 10.100.0.6
Jan 26 15:39:48 compute-1 ovn_controller[95641]: 2026-01-26T15:39:48Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:81:1e 10.100.0.6
Jan 26 15:39:48 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 15:39:48 compute-1 nova_compute[183403]: 2026-01-26 15:39:48.443 183407 DEBUG nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Creating tmpfile /var/lib/nova/instances/tmp9rgzyne_ to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:39:48 compute-1 nova_compute[183403]: 2026-01-26 15:39:48.445 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:48 compute-1 nova_compute[183403]: 2026-01-26 15:39:48.458 183407 DEBUG nova.compute.manager [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9rgzyne_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:39:49 compute-1 openstack_network_exporter[195610]: ERROR   15:39:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:39:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:39:49 compute-1 openstack_network_exporter[195610]: ERROR   15:39:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:39:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:39:50 compute-1 nova_compute[183403]: 2026-01-26 15:39:50.505 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:51 compute-1 podman[215086]: 2026-01-26 15:39:51.893482629 +0000 UTC m=+0.062460966 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest)
Jan 26 15:39:51 compute-1 podman[215085]: 2026-01-26 15:39:51.927593183 +0000 UTC m=+0.097442034 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:39:52 compute-1 nova_compute[183403]: 2026-01-26 15:39:52.218 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:52 compute-1 nova_compute[183403]: 2026-01-26 15:39:52.677 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:54 compute-1 nova_compute[183403]: 2026-01-26 15:39:54.564 183407 DEBUG nova.compute.manager [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9rgzyne_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bce6bafc-d40f-4b73-ac98-65b3105eb77f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:39:55 compute-1 nova_compute[183403]: 2026-01-26 15:39:55.582 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-bce6bafc-d40f-4b73-ac98-65b3105eb77f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:39:55 compute-1 nova_compute[183403]: 2026-01-26 15:39:55.583 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-bce6bafc-d40f-4b73-ac98-65b3105eb77f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:39:55 compute-1 nova_compute[183403]: 2026-01-26 15:39:55.583 183407 DEBUG nova.network.neutron [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:39:56 compute-1 nova_compute[183403]: 2026-01-26 15:39:56.095 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:56 compute-1 nova_compute[183403]: 2026-01-26 15:39:56.787 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:56 compute-1 nova_compute[183403]: 2026-01-26 15:39:56.940 183407 DEBUG nova.network.neutron [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Updating instance_info_cache with network_info: [{"id": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "address": "fa:16:3e:e4:55:f5", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e5208f0-f7", "ovs_interfaceid": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.220 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.446 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-bce6bafc-d40f-4b73-ac98-65b3105eb77f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.474 183407 DEBUG nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9rgzyne_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bce6bafc-d40f-4b73-ac98-65b3105eb77f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.475 183407 DEBUG nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Creating instance directory: /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.476 183407 DEBUG nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Creating disk.info with the contents: {'/var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk': 'qcow2', '/var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.476 183407 DEBUG nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.478 183407 DEBUG nova.objects.instance [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid bce6bafc-d40f-4b73-ac98-65b3105eb77f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.678 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.986 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.991 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:39:57 compute-1 nova_compute[183403]: 2026-01-26 15:39:57.994 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.055 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.056 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.057 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.058 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.061 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.062 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.120 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.122 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.163 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.165 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.166 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.235 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.236 183407 DEBUG nova.virt.disk.api [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.237 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.304 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.306 183407 DEBUG nova.virt.disk.api [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.307 183407 DEBUG nova.objects.instance [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid bce6bafc-d40f-4b73-ac98-65b3105eb77f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.814 183407 DEBUG nova.objects.base [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<bce6bafc-d40f-4b73-ac98-65b3105eb77f> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.814 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.841 183407 DEBUG oslo_concurrency.processutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk.config 497664" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.842 183407 DEBUG nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.843 183407 DEBUG nova.virt.libvirt.vif [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:38:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-487098558',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-487098558',id=30,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:39:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-n9ub8z4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:39:08Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=bce6bafc-d40f-4b73-ac98-65b3105eb77f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "address": "fa:16:3e:e4:55:f5", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9e5208f0-f7", "ovs_interfaceid": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.844 183407 DEBUG nova.network.os_vif_util [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "address": "fa:16:3e:e4:55:f5", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9e5208f0-f7", "ovs_interfaceid": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.844 183407 DEBUG nova.network.os_vif_util [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:55:f5,bridge_name='br-int',has_traffic_filtering=True,id=9e5208f0-f7da-4ff3-9f0d-8fa28072d466,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e5208f0-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.845 183407 DEBUG os_vif [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:55:f5,bridge_name='br-int',has_traffic_filtering=True,id=9e5208f0-f7da-4ff3-9f0d-8fa28072d466,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e5208f0-f7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.845 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.845 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.846 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.846 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.847 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '303a1f34-b63c-56cb-94fc-59503241686b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.848 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.849 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.853 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.853 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e5208f0-f7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.854 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap9e5208f0-f7, col_values=(('qos', UUID('54a038d6-fc50-45f5-bf06-ec6acdbbc77a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.854 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap9e5208f0-f7, col_values=(('external_ids', {'iface-id': '9e5208f0-f7da-4ff3-9f0d-8fa28072d466', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:55:f5', 'vm-uuid': 'bce6bafc-d40f-4b73-ac98-65b3105eb77f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.855 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:58 compute-1 NetworkManager[55716]: <info>  [1769441998.8561] manager: (tap9e5208f0-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.858 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.863 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.864 183407 INFO os_vif [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:55:f5,bridge_name='br-int',has_traffic_filtering=True,id=9e5208f0-f7da-4ff3-9f0d-8fa28072d466,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e5208f0-f7')
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.864 183407 DEBUG nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.865 183407 DEBUG nova.compute.manager [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9rgzyne_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bce6bafc-d40f-4b73-ac98-65b3105eb77f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:39:58 compute-1 nova_compute[183403]: 2026-01-26 15:39:58.865 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:39:59 compute-1 nova_compute[183403]: 2026-01-26 15:39:59.489 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:00 compute-1 nova_compute[183403]: 2026-01-26 15:40:00.803 183407 DEBUG nova.network.neutron [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Port 9e5208f0-f7da-4ff3-9f0d-8fa28072d466 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:40:00 compute-1 nova_compute[183403]: 2026-01-26 15:40:00.816 183407 DEBUG nova.compute.manager [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9rgzyne_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bce6bafc-d40f-4b73-ac98-65b3105eb77f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:40:02 compute-1 ovn_controller[95641]: 2026-01-26T15:40:02Z|00241|memory_trim|INFO|Detected inactivity (last active 30024 ms ago): trimming memory
Jan 26 15:40:02 compute-1 nova_compute[183403]: 2026-01-26 15:40:02.225 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:03 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 26 15:40:03 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 26 15:40:03 compute-1 kernel: tap9e5208f0-f7: entered promiscuous mode
Jan 26 15:40:03 compute-1 NetworkManager[55716]: <info>  [1769442003.8221] manager: (tap9e5208f0-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Jan 26 15:40:03 compute-1 nova_compute[183403]: 2026-01-26 15:40:03.824 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:03 compute-1 ovn_controller[95641]: 2026-01-26T15:40:03Z|00242|binding|INFO|Claiming lport 9e5208f0-f7da-4ff3-9f0d-8fa28072d466 for this additional chassis.
Jan 26 15:40:03 compute-1 ovn_controller[95641]: 2026-01-26T15:40:03Z|00243|binding|INFO|9e5208f0-f7da-4ff3-9f0d-8fa28072d466: Claiming fa:16:3e:e4:55:f5 10.100.0.14
Jan 26 15:40:03 compute-1 nova_compute[183403]: 2026-01-26 15:40:03.827 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:03 compute-1 ovn_controller[95641]: 2026-01-26T15:40:03Z|00244|binding|INFO|Setting lport 9e5208f0-f7da-4ff3-9f0d-8fa28072d466 ovn-installed in OVS
Jan 26 15:40:03 compute-1 nova_compute[183403]: 2026-01-26 15:40:03.846 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.849 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:55:f5 10.100.0.14'], port_security=['fa:16:3e:e4:55:f5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'bce6bafc-d40f-4b73-ac98-65b3105eb77f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '10', 'neutron:security_group_ids': '41d84080-c1ba-42a6-8417-b57b30232ea3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2b7e03b-5278-4177-91b7-862e57a7c9ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=9e5208f0-f7da-4ff3-9f0d-8fa28072d466) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.849 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 9e5208f0-f7da-4ff3-9f0d-8fa28072d466 in datapath d9d38847-a43b-4d1e-a0b1-c3f77a879374 unbound from our chassis
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.850 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:40:03 compute-1 systemd-udevd[215184]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:40:03 compute-1 nova_compute[183403]: 2026-01-26 15:40:03.855 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:03 compute-1 systemd-machined[154697]: New machine qemu-23-instance-0000001e.
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.864 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ac299f-5efe-42ad-aef4-fe36205029e1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:03 compute-1 NetworkManager[55716]: <info>  [1769442003.8673] device (tap9e5208f0-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:40:03 compute-1 NetworkManager[55716]: <info>  [1769442003.8686] device (tap9e5208f0-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:40:03 compute-1 systemd[1]: Started Virtual Machine qemu-23-instance-0000001e.
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.896 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[6430d6cc-2094-408a-85c0-099b3feb730e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.898 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[b410abd7-ac4b-4a75-b6d1-6a65bdd68da2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.931 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a5ab98-06d3-4896-90d8-88e68f8e9c9e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.949 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec3fc08-193e-4b10-acec-7e18ef55f7d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9d38847-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:11:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569446, 'reachable_time': 31143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215197, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.965 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[fd90904e-9b4a-4a3f-ada4-d1450eb6f1c8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd9d38847-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569467, 'tstamp': 569467}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215199, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd9d38847-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569472, 'tstamp': 569472}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215199, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.966 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9d38847-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:03 compute-1 nova_compute[183403]: 2026-01-26 15:40:03.967 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:03 compute-1 nova_compute[183403]: 2026-01-26 15:40:03.969 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.969 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9d38847-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.969 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.969 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9d38847-a0, col_values=(('external_ids', {'iface-id': '70e6ec0a-21db-4d83-b0cb-0624424ede18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.970 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:40:03 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:03.971 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[fca6f186-c945-485c-8547-7d92f6b39010]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d9d38847-a43b-4d1e-a0b1-c3f77a879374\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d9d38847-a43b-4d1e-a0b1-c3f77a879374\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:04 compute-1 sshd-session[215148]: Invalid user solana from 80.94.92.168 port 54336
Jan 26 15:40:04 compute-1 sshd-session[215148]: Connection closed by invalid user solana 80.94.92.168 port 54336 [preauth]
Jan 26 15:40:05 compute-1 podman[192725]: time="2026-01-26T15:40:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:40:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:40:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:40:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:40:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2655 "" "Go-http-client/1.1"
Jan 26 15:40:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:06.392 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:40:06 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:06.392 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:40:06 compute-1 nova_compute[183403]: 2026-01-26 15:40:06.393 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:06 compute-1 ovn_controller[95641]: 2026-01-26T15:40:06Z|00245|binding|INFO|Claiming lport 9e5208f0-f7da-4ff3-9f0d-8fa28072d466 for this chassis.
Jan 26 15:40:06 compute-1 ovn_controller[95641]: 2026-01-26T15:40:06Z|00246|binding|INFO|9e5208f0-f7da-4ff3-9f0d-8fa28072d466: Claiming fa:16:3e:e4:55:f5 10.100.0.14
Jan 26 15:40:06 compute-1 ovn_controller[95641]: 2026-01-26T15:40:06Z|00247|binding|INFO|Setting lport 9e5208f0-f7da-4ff3-9f0d-8fa28072d466 up in Southbound
Jan 26 15:40:07 compute-1 nova_compute[183403]: 2026-01-26 15:40:07.226 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:07 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:07.394 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:07 compute-1 nova_compute[183403]: 2026-01-26 15:40:07.489 183407 INFO nova.compute.manager [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Post operation of migration started
Jan 26 15:40:07 compute-1 nova_compute[183403]: 2026-01-26 15:40:07.490 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:07 compute-1 nova_compute[183403]: 2026-01-26 15:40:07.767 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:07 compute-1 nova_compute[183403]: 2026-01-26 15:40:07.768 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:07 compute-1 nova_compute[183403]: 2026-01-26 15:40:07.907 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-bce6bafc-d40f-4b73-ac98-65b3105eb77f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:40:07 compute-1 nova_compute[183403]: 2026-01-26 15:40:07.907 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-bce6bafc-d40f-4b73-ac98-65b3105eb77f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:40:07 compute-1 nova_compute[183403]: 2026-01-26 15:40:07.907 183407 DEBUG nova.network.neutron [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:40:08 compute-1 nova_compute[183403]: 2026-01-26 15:40:08.470 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:08 compute-1 nova_compute[183403]: 2026-01-26 15:40:08.858 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:10 compute-1 nova_compute[183403]: 2026-01-26 15:40:10.493 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:10 compute-1 nova_compute[183403]: 2026-01-26 15:40:10.669 183407 DEBUG nova.network.neutron [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Updating instance_info_cache with network_info: [{"id": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "address": "fa:16:3e:e4:55:f5", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e5208f0-f7", "ovs_interfaceid": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:40:11 compute-1 nova_compute[183403]: 2026-01-26 15:40:11.181 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-bce6bafc-d40f-4b73-ac98-65b3105eb77f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:40:11 compute-1 nova_compute[183403]: 2026-01-26 15:40:11.705 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:11 compute-1 nova_compute[183403]: 2026-01-26 15:40:11.705 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:11 compute-1 nova_compute[183403]: 2026-01-26 15:40:11.705 183407 DEBUG oslo_concurrency.lockutils [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:11 compute-1 nova_compute[183403]: 2026-01-26 15:40:11.711 183407 INFO nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:40:11 compute-1 virtqemud[183290]: Domain id=23 name='instance-0000001e' uuid=bce6bafc-d40f-4b73-ac98-65b3105eb77f is tainted: custom-monitor
Jan 26 15:40:11 compute-1 podman[215223]: 2026-01-26 15:40:11.93422754 +0000 UTC m=+0.096376046 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:40:11 compute-1 podman[215224]: 2026-01-26 15:40:11.942281449 +0000 UTC m=+0.099622431 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 15:40:12 compute-1 nova_compute[183403]: 2026-01-26 15:40:12.228 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:12 compute-1 nova_compute[183403]: 2026-01-26 15:40:12.719 183407 INFO nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:40:13 compute-1 nova_compute[183403]: 2026-01-26 15:40:13.727 183407 INFO nova.virt.libvirt.driver [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:40:13 compute-1 nova_compute[183403]: 2026-01-26 15:40:13.732 183407 DEBUG nova.compute.manager [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:40:13 compute-1 nova_compute[183403]: 2026-01-26 15:40:13.860 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:14 compute-1 nova_compute[183403]: 2026-01-26 15:40:14.265 183407 DEBUG nova.objects.instance [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:40:15 compute-1 nova_compute[183403]: 2026-01-26 15:40:15.365 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:15 compute-1 nova_compute[183403]: 2026-01-26 15:40:15.512 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:15 compute-1 nova_compute[183403]: 2026-01-26 15:40:15.513 183407 WARNING neutronclient.v2_0.client [None req-fe0bc4fa-8788-421b-9b71-869bdaa0b29a a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:17 compute-1 nova_compute[183403]: 2026-01-26 15:40:17.232 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:18 compute-1 nova_compute[183403]: 2026-01-26 15:40:18.864 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:19 compute-1 openstack_network_exporter[195610]: ERROR   15:40:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:40:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:40:19 compute-1 openstack_network_exporter[195610]: ERROR   15:40:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:40:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:40:22 compute-1 nova_compute[183403]: 2026-01-26 15:40:22.234 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:22 compute-1 nova_compute[183403]: 2026-01-26 15:40:22.266 183407 DEBUG oslo_concurrency.lockutils [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:22 compute-1 nova_compute[183403]: 2026-01-26 15:40:22.266 183407 DEBUG oslo_concurrency.lockutils [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:22 compute-1 nova_compute[183403]: 2026-01-26 15:40:22.267 183407 DEBUG oslo_concurrency.lockutils [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:22 compute-1 nova_compute[183403]: 2026-01-26 15:40:22.267 183407 DEBUG oslo_concurrency.lockutils [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:22 compute-1 nova_compute[183403]: 2026-01-26 15:40:22.267 183407 DEBUG oslo_concurrency.lockutils [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:22 compute-1 nova_compute[183403]: 2026-01-26 15:40:22.281 183407 INFO nova.compute.manager [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Terminating instance
Jan 26 15:40:22 compute-1 nova_compute[183403]: 2026-01-26 15:40:22.798 183407 DEBUG nova.compute.manager [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:40:22 compute-1 podman[215270]: 2026-01-26 15:40:22.934002508 +0000 UTC m=+0.093336559 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 15:40:22 compute-1 podman[215269]: 2026-01-26 15:40:22.979429844 +0000 UTC m=+0.143571660 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 15:40:23 compute-1 nova_compute[183403]: 2026-01-26 15:40:23.867 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 kernel: tapb0d5fa76-2c (unregistering): left promiscuous mode
Jan 26 15:40:24 compute-1 NetworkManager[55716]: <info>  [1769442024.0227] device (tapb0d5fa76-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:40:24 compute-1 ovn_controller[95641]: 2026-01-26T15:40:24Z|00248|binding|INFO|Releasing lport b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 from this chassis (sb_readonly=0)
Jan 26 15:40:24 compute-1 ovn_controller[95641]: 2026-01-26T15:40:24Z|00249|binding|INFO|Setting lport b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 down in Southbound
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.029 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 ovn_controller[95641]: 2026-01-26T15:40:24Z|00250|binding|INFO|Removing iface tapb0d5fa76-2c ovn-installed in OVS
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.031 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.040 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:81:1e 10.100.0.6'], port_security=['fa:16:3e:9e:81:1e 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '5', 'neutron:security_group_ids': '41d84080-c1ba-42a6-8417-b57b30232ea3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2b7e03b-5278-4177-91b7-862e57a7c9ad, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.042 104930 INFO neutron.agent.ovn.metadata.agent [-] Port b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 in datapath d9d38847-a43b-4d1e-a0b1-c3f77a879374 unbound from our chassis
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.043 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.059 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.072 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2330e19e-beba-4403-9c94-f0a8e7bfd589]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:24 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Jan 26 15:40:24 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001f.scope: Consumed 15.311s CPU time.
Jan 26 15:40:24 compute-1 systemd-machined[154697]: Machine qemu-22-instance-0000001f terminated.
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.114 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[dc970cc3-9438-444f-9d4c-851868a1561e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.117 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[44d75f6e-dacb-478d-adca-dbde83427fe0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.153 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[9588fbbc-d6f9-4645-9435-740cb7d213b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.177 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d882cd-8ef0-4911-81c1-fad547955a7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9d38847-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:11:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569446, 'reachable_time': 31143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215326, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.198 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5012e16d-ee3e-4fb7-b215-fe6acd1e5123]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd9d38847-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569467, 'tstamp': 569467}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215327, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd9d38847-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569472, 'tstamp': 569472}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215327, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.199 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9d38847-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.201 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.206 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.207 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9d38847-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.207 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.208 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9d38847-a0, col_values=(('external_ids', {'iface-id': '70e6ec0a-21db-4d83-b0cb-0624424ede18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.208 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:40:24 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:24.210 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[b06204c9-6ce5-404d-9481-1de482429278]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d9d38847-a43b-4d1e-a0b1-c3f77a879374\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d9d38847-a43b-4d1e-a0b1-c3f77a879374\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.282 183407 INFO nova.virt.libvirt.driver [-] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Instance destroyed successfully.
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.283 183407 DEBUG nova.objects.instance [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lazy-loading 'resources' on Instance uuid b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.383 183407 DEBUG nova.compute.manager [req-0cc28cf2-ba16-486d-bd7e-61f5683a09f2 req-8a734aa4-f80a-41be-b9fd-84223f0d01ca 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Received event network-vif-unplugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.383 183407 DEBUG oslo_concurrency.lockutils [req-0cc28cf2-ba16-486d-bd7e-61f5683a09f2 req-8a734aa4-f80a-41be-b9fd-84223f0d01ca 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.385 183407 DEBUG oslo_concurrency.lockutils [req-0cc28cf2-ba16-486d-bd7e-61f5683a09f2 req-8a734aa4-f80a-41be-b9fd-84223f0d01ca 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.385 183407 DEBUG oslo_concurrency.lockutils [req-0cc28cf2-ba16-486d-bd7e-61f5683a09f2 req-8a734aa4-f80a-41be-b9fd-84223f0d01ca 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.385 183407 DEBUG nova.compute.manager [req-0cc28cf2-ba16-486d-bd7e-61f5683a09f2 req-8a734aa4-f80a-41be-b9fd-84223f0d01ca 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] No waiting events found dispatching network-vif-unplugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.385 183407 DEBUG nova.compute.manager [req-0cc28cf2-ba16-486d-bd7e-61f5683a09f2 req-8a734aa4-f80a-41be-b9fd-84223f0d01ca 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Received event network-vif-unplugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.887 183407 DEBUG nova.virt.libvirt.vif [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:39:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-810720634',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-810720634',id=31,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:39:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-hcbmag46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:39:33Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.888 183407 DEBUG nova.network.os_vif_util [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converting VIF {"id": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "address": "fa:16:3e:9e:81:1e", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0d5fa76-2c", "ovs_interfaceid": "b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.889 183407 DEBUG nova.network.os_vif_util [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:81:1e,bridge_name='br-int',has_traffic_filtering=True,id=b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0d5fa76-2c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.890 183407 DEBUG os_vif [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:81:1e,bridge_name='br-int',has_traffic_filtering=True,id=b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0d5fa76-2c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.894 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.895 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb0d5fa76-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.897 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.899 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.900 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.901 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4f12d460-57e5-4c96-9a1b-880c43edee4d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.903 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.905 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.907 183407 INFO os_vif [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:81:1e,bridge_name='br-int',has_traffic_filtering=True,id=b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0d5fa76-2c')
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.908 183407 INFO nova.virt.libvirt.driver [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Deleting instance files /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab_del
Jan 26 15:40:24 compute-1 nova_compute[183403]: 2026-01-26 15:40:24.909 183407 INFO nova.virt.libvirt.driver [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Deletion of /var/lib/nova/instances/b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab_del complete
Jan 26 15:40:25 compute-1 nova_compute[183403]: 2026-01-26 15:40:25.614 183407 INFO nova.compute.manager [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Took 2.81 seconds to destroy the instance on the hypervisor.
Jan 26 15:40:25 compute-1 nova_compute[183403]: 2026-01-26 15:40:25.615 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:40:25 compute-1 nova_compute[183403]: 2026-01-26 15:40:25.615 183407 DEBUG nova.compute.manager [-] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:40:25 compute-1 nova_compute[183403]: 2026-01-26 15:40:25.615 183407 DEBUG nova.network.neutron [-] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:40:25 compute-1 nova_compute[183403]: 2026-01-26 15:40:25.616 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:26 compute-1 nova_compute[183403]: 2026-01-26 15:40:26.437 183407 DEBUG nova.compute.manager [req-5a55929a-2fed-45e7-b1ae-babcf549e7fb req-c75533c9-7976-4030-9f48-974fb6b8b6ad 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Received event network-vif-unplugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:40:26 compute-1 nova_compute[183403]: 2026-01-26 15:40:26.437 183407 DEBUG oslo_concurrency.lockutils [req-5a55929a-2fed-45e7-b1ae-babcf549e7fb req-c75533c9-7976-4030-9f48-974fb6b8b6ad 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:26 compute-1 nova_compute[183403]: 2026-01-26 15:40:26.438 183407 DEBUG oslo_concurrency.lockutils [req-5a55929a-2fed-45e7-b1ae-babcf549e7fb req-c75533c9-7976-4030-9f48-974fb6b8b6ad 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:26 compute-1 nova_compute[183403]: 2026-01-26 15:40:26.438 183407 DEBUG oslo_concurrency.lockutils [req-5a55929a-2fed-45e7-b1ae-babcf549e7fb req-c75533c9-7976-4030-9f48-974fb6b8b6ad 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:26 compute-1 nova_compute[183403]: 2026-01-26 15:40:26.438 183407 DEBUG nova.compute.manager [req-5a55929a-2fed-45e7-b1ae-babcf549e7fb req-c75533c9-7976-4030-9f48-974fb6b8b6ad 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] No waiting events found dispatching network-vif-unplugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:40:26 compute-1 nova_compute[183403]: 2026-01-26 15:40:26.438 183407 DEBUG nova.compute.manager [req-5a55929a-2fed-45e7-b1ae-babcf549e7fb req-c75533c9-7976-4030-9f48-974fb6b8b6ad 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Received event network-vif-unplugged-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:40:26 compute-1 nova_compute[183403]: 2026-01-26 15:40:26.518 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:26 compute-1 nova_compute[183403]: 2026-01-26 15:40:26.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:40:27 compute-1 nova_compute[183403]: 2026-01-26 15:40:27.237 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:27 compute-1 nova_compute[183403]: 2026-01-26 15:40:27.282 183407 DEBUG nova.network.neutron [-] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:40:27 compute-1 nova_compute[183403]: 2026-01-26 15:40:27.819 183407 INFO nova.compute.manager [-] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Took 2.20 seconds to deallocate network for instance.
Jan 26 15:40:28 compute-1 nova_compute[183403]: 2026-01-26 15:40:28.346 183407 DEBUG oslo_concurrency.lockutils [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:28 compute-1 nova_compute[183403]: 2026-01-26 15:40:28.346 183407 DEBUG oslo_concurrency.lockutils [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:28 compute-1 nova_compute[183403]: 2026-01-26 15:40:28.417 183407 DEBUG nova.compute.provider_tree [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:40:28 compute-1 nova_compute[183403]: 2026-01-26 15:40:28.498 183407 DEBUG nova.compute.manager [req-128649f8-9818-463f-a07f-10d0c41165d7 req-ce4d7dd5-f6f3-44de-adc2-c320cd1c7e5f 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab] Received event network-vif-deleted-b0d5fa76-2cd0-497f-9f26-af6ef0c5d9f4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:40:28 compute-1 nova_compute[183403]: 2026-01-26 15:40:28.926 183407 DEBUG nova.scheduler.client.report [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:40:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:29.099 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:29.099 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:29.100 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:29 compute-1 nova_compute[183403]: 2026-01-26 15:40:29.440 183407 DEBUG oslo_concurrency.lockutils [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:29 compute-1 nova_compute[183403]: 2026-01-26 15:40:29.473 183407 INFO nova.scheduler.client.report [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Deleted allocations for instance b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab
Jan 26 15:40:29 compute-1 nova_compute[183403]: 2026-01-26 15:40:29.903 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:30 compute-1 nova_compute[183403]: 2026-01-26 15:40:30.504 183407 DEBUG oslo_concurrency.lockutils [None req-72661141-f4e1-4b0c-a0b6-4b241b653db3 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "b7a1e6b3-9b3d-4ff6-8a3c-1484c1ecbcab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.237s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:30 compute-1 nova_compute[183403]: 2026-01-26 15:40:30.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.203 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.204 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.204 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.205 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.492 183407 DEBUG oslo_concurrency.lockutils [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.493 183407 DEBUG oslo_concurrency.lockutils [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.493 183407 DEBUG oslo_concurrency.lockutils [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.493 183407 DEBUG oslo_concurrency.lockutils [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.493 183407 DEBUG oslo_concurrency.lockutils [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:31 compute-1 nova_compute[183403]: 2026-01-26 15:40:31.509 183407 INFO nova.compute.manager [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Terminating instance
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.027 183407 DEBUG nova.compute.manager [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:40:32 compute-1 kernel: tap9e5208f0-f7 (unregistering): left promiscuous mode
Jan 26 15:40:32 compute-1 NetworkManager[55716]: <info>  [1769442032.1691] device (tap9e5208f0-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:40:32 compute-1 ovn_controller[95641]: 2026-01-26T15:40:32Z|00251|binding|INFO|Releasing lport 9e5208f0-f7da-4ff3-9f0d-8fa28072d466 from this chassis (sb_readonly=0)
Jan 26 15:40:32 compute-1 ovn_controller[95641]: 2026-01-26T15:40:32Z|00252|binding|INFO|Setting lport 9e5208f0-f7da-4ff3-9f0d-8fa28072d466 down in Southbound
Jan 26 15:40:32 compute-1 ovn_controller[95641]: 2026-01-26T15:40:32Z|00253|binding|INFO|Removing iface tap9e5208f0-f7 ovn-installed in OVS
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.177 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.195 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:32.203 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:55:f5 10.100.0.14'], port_security=['fa:16:3e:e4:55:f5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'bce6bafc-d40f-4b73-ac98-65b3105eb77f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '16', 'neutron:security_group_ids': '41d84080-c1ba-42a6-8417-b57b30232ea3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2b7e03b-5278-4177-91b7-862e57a7c9ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=9e5208f0-f7da-4ff3-9f0d-8fa28072d466) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:40:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:32.204 104930 INFO neutron.agent.ovn.metadata.agent [-] Port 9e5208f0-f7da-4ff3-9f0d-8fa28072d466 in datapath d9d38847-a43b-4d1e-a0b1-c3f77a879374 unbound from our chassis
Jan 26 15:40:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:32.205 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9d38847-a43b-4d1e-a0b1-c3f77a879374, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:40:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:32.206 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8e156d42-256d-4fab-8775-e72c1704d5f8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:32.207 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 namespace which is not needed anymore
Jan 26 15:40:32 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 26 15:40:32 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Consumed 2.530s CPU time.
Jan 26 15:40:32 compute-1 systemd-machined[154697]: Machine qemu-23-instance-0000001e terminated.
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.238 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:32 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215011]: [NOTICE]   (215015) : haproxy version is 3.0.5-8e879a5
Jan 26 15:40:32 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215011]: [NOTICE]   (215015) : path to executable is /usr/sbin/haproxy
Jan 26 15:40:32 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215011]: [WARNING]  (215015) : Exiting Master process...
Jan 26 15:40:32 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215011]: [ALERT]    (215015) : Current worker (215017) exited with code 143 (Terminated)
Jan 26 15:40:32 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215011]: [WARNING]  (215015) : All workers exited. Exiting... (0)
Jan 26 15:40:32 compute-1 podman[215374]: 2026-01-26 15:40:32.344589443 +0000 UTC m=+0.037058792 container kill 286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120)
Jan 26 15:40:32 compute-1 systemd[1]: libpod-286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3.scope: Deactivated successfully.
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.489 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.547 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.558 183407 INFO nova.virt.libvirt.driver [-] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Instance destroyed successfully.
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.560 183407 DEBUG nova.objects.instance [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lazy-loading 'resources' on Instance uuid bce6bafc-d40f-4b73-ac98-65b3105eb77f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.623 183407 DEBUG nova.compute.manager [req-da8e77ca-5e40-4657-b3a4-b8a103b13749 req-4e17b0b1-938a-420b-9a62-3db67f46657c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Received event network-vif-unplugged-9e5208f0-f7da-4ff3-9f0d-8fa28072d466 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.624 183407 DEBUG oslo_concurrency.lockutils [req-da8e77ca-5e40-4657-b3a4-b8a103b13749 req-4e17b0b1-938a-420b-9a62-3db67f46657c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.624 183407 DEBUG oslo_concurrency.lockutils [req-da8e77ca-5e40-4657-b3a4-b8a103b13749 req-4e17b0b1-938a-420b-9a62-3db67f46657c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.625 183407 DEBUG oslo_concurrency.lockutils [req-da8e77ca-5e40-4657-b3a4-b8a103b13749 req-4e17b0b1-938a-420b-9a62-3db67f46657c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.625 183407 DEBUG nova.compute.manager [req-da8e77ca-5e40-4657-b3a4-b8a103b13749 req-4e17b0b1-938a-420b-9a62-3db67f46657c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] No waiting events found dispatching network-vif-unplugged-9e5208f0-f7da-4ff3-9f0d-8fa28072d466 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.625 183407 DEBUG nova.compute.manager [req-da8e77ca-5e40-4657-b3a4-b8a103b13749 req-4e17b0b1-938a-420b-9a62-3db67f46657c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Received event network-vif-unplugged-9e5208f0-f7da-4ff3-9f0d-8fa28072d466 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.679 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk --force-share --output=json" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.680 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.767 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.944 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.945 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.970 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.970 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5665MB free_disk=73.11560440063477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.971 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:32 compute-1 nova_compute[183403]: 2026-01-26 15:40:32.971 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.068 183407 DEBUG nova.virt.libvirt.vif [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:38:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-487098558',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-487098558',id=30,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:39:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-n9ub8z4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:40:14Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=bce6bafc-d40f-4b73-ac98-65b3105eb77f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "address": "fa:16:3e:e4:55:f5", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e5208f0-f7", "ovs_interfaceid": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.069 183407 DEBUG nova.network.os_vif_util [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converting VIF {"id": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "address": "fa:16:3e:e4:55:f5", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e5208f0-f7", "ovs_interfaceid": "9e5208f0-f7da-4ff3-9f0d-8fa28072d466", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.070 183407 DEBUG nova.network.os_vif_util [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:55:f5,bridge_name='br-int',has_traffic_filtering=True,id=9e5208f0-f7da-4ff3-9f0d-8fa28072d466,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e5208f0-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.070 183407 DEBUG os_vif [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:55:f5,bridge_name='br-int',has_traffic_filtering=True,id=9e5208f0-f7da-4ff3-9f0d-8fa28072d466,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e5208f0-f7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.072 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.072 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e5208f0-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.074 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.075 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.076 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.077 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=54a038d6-fc50-45f5-bf06-ec6acdbbc77a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.078 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.080 183407 INFO os_vif [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:55:f5,bridge_name='br-int',has_traffic_filtering=True,id=9e5208f0-f7da-4ff3-9f0d-8fa28072d466,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e5208f0-f7')
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.080 183407 INFO nova.virt.libvirt.driver [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Deleting instance files /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f_del
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.081 183407 INFO nova.virt.libvirt.driver [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Deletion of /var/lib/nova/instances/bce6bafc-d40f-4b73-ac98-65b3105eb77f_del complete
Jan 26 15:40:33 compute-1 podman[215425]: 2026-01-26 15:40:33.33935455 +0000 UTC m=+0.035883560 container died 286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.593 183407 INFO nova.compute.manager [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Took 1.56 seconds to destroy the instance on the hypervisor.
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.593 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.593 183407 DEBUG nova.compute.manager [-] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.594 183407 DEBUG nova.network.neutron [-] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:40:33 compute-1 nova_compute[183403]: 2026-01-26 15:40:33.594 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.526 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.534 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance bce6bafc-d40f-4b73-ac98-65b3105eb77f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.535 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.535 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:40:32 up  1:35,  0 user,  load average: 0.27, 0.17, 0.19\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_af8eea38f1d74ad1a01087c020ea8d02': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.582 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.746 183407 DEBUG nova.compute.manager [req-841da68f-8332-48ce-886f-c201f6e3f4da req-a3804062-9678-44fe-9ea4-4bc4bbddbc7c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Received event network-vif-unplugged-9e5208f0-f7da-4ff3-9f0d-8fa28072d466 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.747 183407 DEBUG oslo_concurrency.lockutils [req-841da68f-8332-48ce-886f-c201f6e3f4da req-a3804062-9678-44fe-9ea4-4bc4bbddbc7c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.747 183407 DEBUG oslo_concurrency.lockutils [req-841da68f-8332-48ce-886f-c201f6e3f4da req-a3804062-9678-44fe-9ea4-4bc4bbddbc7c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.748 183407 DEBUG oslo_concurrency.lockutils [req-841da68f-8332-48ce-886f-c201f6e3f4da req-a3804062-9678-44fe-9ea4-4bc4bbddbc7c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.750 183407 DEBUG nova.compute.manager [req-841da68f-8332-48ce-886f-c201f6e3f4da req-a3804062-9678-44fe-9ea4-4bc4bbddbc7c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] No waiting events found dispatching network-vif-unplugged-9e5208f0-f7da-4ff3-9f0d-8fa28072d466 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:40:34 compute-1 nova_compute[183403]: 2026-01-26 15:40:34.750 183407 DEBUG nova.compute.manager [req-841da68f-8332-48ce-886f-c201f6e3f4da req-a3804062-9678-44fe-9ea4-4bc4bbddbc7c 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Received event network-vif-unplugged-9e5208f0-f7da-4ff3-9f0d-8fa28072d466 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:40:35 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3-userdata-shm.mount: Deactivated successfully.
Jan 26 15:40:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-7ff6f30108b7d1787f5ecdcf0b399b7b3b1609a681826d9faeb46638ec70da0a-merged.mount: Deactivated successfully.
Jan 26 15:40:35 compute-1 nova_compute[183403]: 2026-01-26 15:40:35.090 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:40:35 compute-1 nova_compute[183403]: 2026-01-26 15:40:35.607 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:40:35 compute-1 nova_compute[183403]: 2026-01-26 15:40:35.608 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.636s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:35 compute-1 podman[192725]: time="2026-01-26T15:40:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:40:35 compute-1 nova_compute[183403]: 2026-01-26 15:40:35.962 183407 DEBUG nova.network.neutron [-] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:40:36 compute-1 nova_compute[183403]: 2026-01-26 15:40:36.470 183407 INFO nova.compute.manager [-] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Took 2.88 seconds to deallocate network for instance.
Jan 26 15:40:36 compute-1 nova_compute[183403]: 2026-01-26 15:40:36.780 183407 DEBUG nova.compute.manager [req-6928b307-9049-44d9-bf30-fd1a93d2fb29 req-f1a2f8e2-fa88-4023-a28f-836e1890996b 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: bce6bafc-d40f-4b73-ac98-65b3105eb77f] Received event network-vif-deleted-9e5208f0-f7da-4ff3-9f0d-8fa28072d466 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:40:37 compute-1 nova_compute[183403]: 2026-01-26 15:40:37.002 183407 DEBUG oslo_concurrency.lockutils [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:40:37 compute-1 nova_compute[183403]: 2026-01-26 15:40:37.003 183407 DEBUG oslo_concurrency.lockutils [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:40:37 compute-1 nova_compute[183403]: 2026-01-26 15:40:37.053 183407 DEBUG nova.compute.provider_tree [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:40:37 compute-1 nova_compute[183403]: 2026-01-26 15:40:37.242 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:37 compute-1 nova_compute[183403]: 2026-01-26 15:40:37.719 183407 DEBUG nova.scheduler.client.report [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.115 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.235 183407 DEBUG oslo_concurrency.lockutils [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.232s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.268 183407 INFO nova.scheduler.client.report [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Deleted allocations for instance bce6bafc-d40f-4b73-ac98-65b3105eb77f
Jan 26 15:40:38 compute-1 podman[215425]: 2026-01-26 15:40:38.463943695 +0000 UTC m=+5.160472705 container remove 286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:40:38 compute-1 podman[192725]: @ - - [26/Jan/2026:15:40:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16568 "" "Go-http-client/1.1"
Jan 26 15:40:38 compute-1 podman[192725]: @ - - [26/Jan/2026:15:40:38 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.479 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[657180a5-aa56-426e-b195-6dce02e7d681]: (4, ("Mon Jan 26 03:40:32 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 (286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3)\n286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3\nMon Jan 26 03:40:33 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 (286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3)\n286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:38 compute-1 systemd[1]: libpod-conmon-286f6d0bf3b2b7c9acc678871946577d1a8e4bccb012bd62f5123383d6e12dd3.scope: Deactivated successfully.
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.481 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3f62b82c-95b6-4de6-93b9-8f54c95565a7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.482 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.483 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[750d21e9-1d9b-46df-9f76-c7785d1fa64d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.484 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9d38847-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.486 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:38 compute-1 kernel: tapd9d38847-a0: left promiscuous mode
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.489 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.493 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab8f89f-0295-4d70-b300-f9d40cde4a87]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.504 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.511 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[05a6e442-259a-4747-820f-d9df65b98c51]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.513 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[611ec71a-a9a4-49fc-b054-23608982326a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.533 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d99383-37d9-4ee6-92a6-bf8c2398abbd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569438, 'reachable_time': 41870, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215442, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:38 compute-1 systemd[1]: run-netns-ovnmeta\x2dd9d38847\x2da43b\x2d4d1e\x2da0b1\x2dc3f77a879374.mount: Deactivated successfully.
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.538 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:40:38 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:40:38.538 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[dfabd2e0-175a-4df5-9da2-dd68f3d5995b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.608 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.609 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.609 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.609 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.610 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.610 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:40:38 compute-1 nova_compute[183403]: 2026-01-26 15:40:38.610 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:40:39 compute-1 nova_compute[183403]: 2026-01-26 15:40:39.301 183407 DEBUG oslo_concurrency.lockutils [None req-bdd84267-3fce-424a-bbe6-6f5204d5048f 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "bce6bafc-d40f-4b73-ac98-65b3105eb77f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.809s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:40:42 compute-1 nova_compute[183403]: 2026-01-26 15:40:42.244 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:42 compute-1 podman[215447]: 2026-01-26 15:40:42.896593566 +0000 UTC m=+0.067920590 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:40:42 compute-1 podman[215448]: 2026-01-26 15:40:42.925625558 +0000 UTC m=+0.083737620 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vcs-type=git)
Jan 26 15:40:43 compute-1 nova_compute[183403]: 2026-01-26 15:40:43.170 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:47 compute-1 nova_compute[183403]: 2026-01-26 15:40:47.246 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:47 compute-1 nova_compute[183403]: 2026-01-26 15:40:47.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:40:48 compute-1 nova_compute[183403]: 2026-01-26 15:40:48.174 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:49 compute-1 openstack_network_exporter[195610]: ERROR   15:40:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:40:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:40:49 compute-1 openstack_network_exporter[195610]: ERROR   15:40:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:40:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:40:52 compute-1 nova_compute[183403]: 2026-01-26 15:40:52.250 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:53 compute-1 nova_compute[183403]: 2026-01-26 15:40:53.176 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:53 compute-1 podman[215492]: 2026-01-26 15:40:53.922458055 +0000 UTC m=+0.081840421 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 15:40:53 compute-1 podman[215491]: 2026-01-26 15:40:53.93692537 +0000 UTC m=+0.110069582 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 15:40:57 compute-1 nova_compute[183403]: 2026-01-26 15:40:57.254 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:40:58 compute-1 nova_compute[183403]: 2026-01-26 15:40:58.180 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:02 compute-1 nova_compute[183403]: 2026-01-26 15:41:02.256 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:03 compute-1 nova_compute[183403]: 2026-01-26 15:41:03.182 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:05 compute-1 podman[192725]: time="2026-01-26T15:41:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:41:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:41:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:41:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:41:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:41:06 compute-1 nova_compute[183403]: 2026-01-26 15:41:06.701 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:06 compute-1 nova_compute[183403]: 2026-01-26 15:41:06.702 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:07 compute-1 nova_compute[183403]: 2026-01-26 15:41:07.209 183407 DEBUG nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 15:41:07 compute-1 nova_compute[183403]: 2026-01-26 15:41:07.257 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:07 compute-1 nova_compute[183403]: 2026-01-26 15:41:07.754 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:07 compute-1 nova_compute[183403]: 2026-01-26 15:41:07.755 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:07 compute-1 nova_compute[183403]: 2026-01-26 15:41:07.763 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 15:41:07 compute-1 nova_compute[183403]: 2026-01-26 15:41:07.764 183407 INFO nova.compute.claims [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Claim successful on node compute-1.ctlplane.example.com
Jan 26 15:41:08 compute-1 nova_compute[183403]: 2026-01-26 15:41:08.184 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:08 compute-1 nova_compute[183403]: 2026-01-26 15:41:08.821 183407 DEBUG nova.compute.provider_tree [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:41:09 compute-1 nova_compute[183403]: 2026-01-26 15:41:09.395 183407 DEBUG nova.scheduler.client.report [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:41:10 compute-1 nova_compute[183403]: 2026-01-26 15:41:10.061 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.306s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:10 compute-1 nova_compute[183403]: 2026-01-26 15:41:10.063 183407 DEBUG nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 15:41:10 compute-1 nova_compute[183403]: 2026-01-26 15:41:10.576 183407 DEBUG nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 15:41:10 compute-1 nova_compute[183403]: 2026-01-26 15:41:10.577 183407 DEBUG nova.network.neutron [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 15:41:10 compute-1 nova_compute[183403]: 2026-01-26 15:41:10.578 183407 WARNING neutronclient.v2_0.client [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:10 compute-1 nova_compute[183403]: 2026-01-26 15:41:10.578 183407 WARNING neutronclient.v2_0.client [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:11 compute-1 nova_compute[183403]: 2026-01-26 15:41:11.084 183407 INFO nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 15:41:11 compute-1 nova_compute[183403]: 2026-01-26 15:41:11.669 183407 DEBUG nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.259 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:12.579 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.579 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:12 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:12.580 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.716 183407 DEBUG nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.717 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.718 183407 INFO nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Creating image(s)
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.718 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "/var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.719 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "/var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.719 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "/var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.720 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.725 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.727 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.761 183407 DEBUG nova.network.neutron [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Successfully created port: b895845f-25a0-49ab-8be5-082a63b18d3d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.782 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.783 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.783 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.784 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.787 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.787 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.848 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.849 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.897 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.897 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.898 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.957 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.958 183407 DEBUG nova.virt.disk.api [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Checking if we can resize image /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:41:12 compute-1 nova_compute[183403]: 2026-01-26 15:41:12.958 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.011 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.013 183407 DEBUG nova.virt.disk.api [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Cannot resize image /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.013 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.014 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Ensure instance console log exists: /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.014 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.014 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.015 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.187 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.631 183407 DEBUG nova.network.neutron [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Successfully updated port: b895845f-25a0-49ab-8be5-082a63b18d3d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.691 183407 DEBUG nova.compute.manager [req-ef5c5e12-84b8-4e35-84e7-c0f98c2de3b7 req-ddaf03f2-229a-4aaf-8062-a446b9b83cdc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Received event network-changed-b895845f-25a0-49ab-8be5-082a63b18d3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.692 183407 DEBUG nova.compute.manager [req-ef5c5e12-84b8-4e35-84e7-c0f98c2de3b7 req-ddaf03f2-229a-4aaf-8062-a446b9b83cdc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Refreshing instance network info cache due to event network-changed-b895845f-25a0-49ab-8be5-082a63b18d3d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.692 183407 DEBUG oslo_concurrency.lockutils [req-ef5c5e12-84b8-4e35-84e7-c0f98c2de3b7 req-ddaf03f2-229a-4aaf-8062-a446b9b83cdc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.692 183407 DEBUG oslo_concurrency.lockutils [req-ef5c5e12-84b8-4e35-84e7-c0f98c2de3b7 req-ddaf03f2-229a-4aaf-8062-a446b9b83cdc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:41:13 compute-1 nova_compute[183403]: 2026-01-26 15:41:13.692 183407 DEBUG nova.network.neutron [req-ef5c5e12-84b8-4e35-84e7-c0f98c2de3b7 req-ddaf03f2-229a-4aaf-8062-a446b9b83cdc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Refreshing network info cache for port b895845f-25a0-49ab-8be5-082a63b18d3d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 15:41:13 compute-1 podman[215555]: 2026-01-26 15:41:13.88823961 +0000 UTC m=+0.061079943 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 15:41:13 compute-1 podman[215554]: 2026-01-26 15:41:13.895406836 +0000 UTC m=+0.069913182 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:41:14 compute-1 nova_compute[183403]: 2026-01-26 15:41:14.139 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "refresh_cache-71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:41:14 compute-1 nova_compute[183403]: 2026-01-26 15:41:14.198 183407 WARNING neutronclient.v2_0.client [req-ef5c5e12-84b8-4e35-84e7-c0f98c2de3b7 req-ddaf03f2-229a-4aaf-8062-a446b9b83cdc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:14 compute-1 nova_compute[183403]: 2026-01-26 15:41:14.532 183407 DEBUG nova.network.neutron [req-ef5c5e12-84b8-4e35-84e7-c0f98c2de3b7 req-ddaf03f2-229a-4aaf-8062-a446b9b83cdc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:41:14 compute-1 nova_compute[183403]: 2026-01-26 15:41:14.681 183407 DEBUG nova.network.neutron [req-ef5c5e12-84b8-4e35-84e7-c0f98c2de3b7 req-ddaf03f2-229a-4aaf-8062-a446b9b83cdc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:41:15 compute-1 nova_compute[183403]: 2026-01-26 15:41:15.349 183407 DEBUG oslo_concurrency.lockutils [req-ef5c5e12-84b8-4e35-84e7-c0f98c2de3b7 req-ddaf03f2-229a-4aaf-8062-a446b9b83cdc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:41:15 compute-1 nova_compute[183403]: 2026-01-26 15:41:15.350 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquired lock "refresh_cache-71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:41:15 compute-1 nova_compute[183403]: 2026-01-26 15:41:15.350 183407 DEBUG nova.network.neutron [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:41:15 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:15.582 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:16 compute-1 nova_compute[183403]: 2026-01-26 15:41:16.556 183407 DEBUG nova.network.neutron [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 15:41:16 compute-1 nova_compute[183403]: 2026-01-26 15:41:16.828 183407 WARNING neutronclient.v2_0.client [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:16 compute-1 nova_compute[183403]: 2026-01-26 15:41:16.987 183407 DEBUG nova.network.neutron [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Updating instance_info_cache with network_info: [{"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": "b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.261 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.494 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Releasing lock "refresh_cache-71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.495 183407 DEBUG nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Instance network_info: |[{"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": "b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.498 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Start _get_guest_xml network_info=[{"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": "b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '354e4d0e-4287-404f-93d3-2c85cfe92fbc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.503 183407 WARNING nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.505 183407 DEBUG nova.virt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1987266720', uuid='71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8'), owner=OwnerMeta(userid='136d3cfdd6cb48e2ab65221bcc05d26c', username='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin', projectid='af8eea38f1d74ad1a01087c020ea8d02', projectname='tempest-TestExecuteZoneMigrationStrategy-1233966703'), image=ImageMeta(id='354e4d0e-4287-404f-93d3-2c85cfe92fbc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": 
"b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769442077.505434) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.512 183407 DEBUG nova.virt.libvirt.host [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.513 183407 DEBUG nova.virt.libvirt.host [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.517 183407 DEBUG nova.virt.libvirt.host [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.517 183407 DEBUG nova.virt.libvirt.host [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.519 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.519 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T15:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='74480e15-23e6-4569-8ef9-3ddf5ac8b981',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T15:01:22Z,direct_url=<?>,disk_format='qcow2',id=354e4d0e-4287-404f-93d3-2c85cfe92fbc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='179f3c996d8f4e7ea1b0aca3ec76f02e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T15:01:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.520 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.520 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.521 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.521 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.521 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.521 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.522 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.522 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.522 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.523 183407 DEBUG nova.virt.hardware [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.529 183407 DEBUG nova.virt.libvirt.vif [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:41:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1987266720',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1987266720',id=33,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-vyy0mpry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:41:11Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": "b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.529 183407 DEBUG nova.network.os_vif_util [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converting VIF {"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": "b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.530 183407 DEBUG nova.network.os_vif_util [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:59:61,bridge_name='br-int',has_traffic_filtering=True,id=b895845f-25a0-49ab-8be5-082a63b18d3d,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb895845f-25') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:41:17 compute-1 nova_compute[183403]: 2026-01-26 15:41:17.532 183407 DEBUG nova.objects.instance [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lazy-loading 'pci_devices' on Instance uuid 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.040 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] End _get_guest_xml xml=<domain type="kvm">
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <uuid>71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8</uuid>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <name>instance-00000021</name>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <memory>131072</memory>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <vcpu>1</vcpu>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <metadata>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1987266720</nova:name>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <nova:creationTime>2026-01-26 15:41:17</nova:creationTime>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <nova:flavor name="m1.nano" id="74480e15-23e6-4569-8ef9-3ddf5ac8b981">
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:memory>128</nova:memory>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:disk>1</nova:disk>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:swap>0</nova:swap>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:vcpus>1</nova:vcpus>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:extraSpecs>
Jan 26 15:41:18 compute-1 nova_compute[183403]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         </nova:extraSpecs>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       </nova:flavor>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <nova:image uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc">
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:minDisk>1</nova:minDisk>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:minRam>0</nova:minRam>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:properties>
Jan 26 15:41:18 compute-1 nova_compute[183403]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         </nova:properties>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       </nova:image>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <nova:owner>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:user uuid="136d3cfdd6cb48e2ab65221bcc05d26c">tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin</nova:user>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:project uuid="af8eea38f1d74ad1a01087c020ea8d02">tempest-TestExecuteZoneMigrationStrategy-1233966703</nova:project>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       </nova:owner>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <nova:root type="image" uuid="354e4d0e-4287-404f-93d3-2c85cfe92fbc"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <nova:ports>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         <nova:port uuid="b895845f-25a0-49ab-8be5-082a63b18d3d">
Jan 26 15:41:18 compute-1 nova_compute[183403]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:         </nova:port>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       </nova:ports>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     </nova:instance>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   </metadata>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <sysinfo type="smbios">
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <system>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <entry name="manufacturer">RDO</entry>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <entry name="product">OpenStack Compute</entry>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <entry name="serial">71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8</entry>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <entry name="uuid">71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8</entry>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <entry name="family">Virtual Machine</entry>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     </system>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   </sysinfo>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <os>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <boot dev="hd"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <smbios mode="sysinfo"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   </os>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <features>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <acpi/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <apic/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <vmcoreinfo/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   </features>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <clock offset="utc">
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <timer name="hpet" present="no"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   </clock>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <cpu mode="custom" match="exact">
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <model>Nehalem</model>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   </cpu>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   <devices>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <disk type="file" device="disk">
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <target dev="vda" bus="virtio"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <disk type="file" device="cdrom">
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <source file="/var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk.config"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <target dev="sda" bus="sata"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     </disk>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <interface type="ethernet">
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <mac address="fa:16:3e:22:59:61"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <mtu size="1442"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <target dev="tapb895845f-25"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     </interface>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <serial type="pty">
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <log file="/var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/console.log" append="off"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     </serial>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <video>
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <model type="virtio"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     </video>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <input type="tablet" bus="usb"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <rng model="virtio">
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <backend model="random">/dev/urandom</backend>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     </rng>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <controller type="usb" index="0"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 15:41:18 compute-1 nova_compute[183403]:       <stats period="10"/>
Jan 26 15:41:18 compute-1 nova_compute[183403]:     </memballoon>
Jan 26 15:41:18 compute-1 nova_compute[183403]:   </devices>
Jan 26 15:41:18 compute-1 nova_compute[183403]: </domain>
Jan 26 15:41:18 compute-1 nova_compute[183403]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.041 183407 DEBUG nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Preparing to wait for external event network-vif-plugged-b895845f-25a0-49ab-8be5-082a63b18d3d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.041 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.041 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.042 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.042 183407 DEBUG nova.virt.libvirt.vif [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-26T15:41:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1987266720',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1987266720',id=33,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-vyy0mpry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:41:11Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": "b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.043 183407 DEBUG nova.network.os_vif_util [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converting VIF {"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": "b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.043 183407 DEBUG nova.network.os_vif_util [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:59:61,bridge_name='br-int',has_traffic_filtering=True,id=b895845f-25a0-49ab-8be5-082a63b18d3d,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb895845f-25') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.044 183407 DEBUG os_vif [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:59:61,bridge_name='br-int',has_traffic_filtering=True,id=b895845f-25a0-49ab-8be5-082a63b18d3d,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb895845f-25') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.044 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.044 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.045 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.045 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.046 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '32b8ead0-6900-54cf-936d-72ac276b7ef5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.047 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.049 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.051 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.052 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb895845f-25, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.052 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb895845f-25, col_values=(('qos', UUID('eef71478-4552-4cfa-9c7f-3d311c99d418')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.052 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb895845f-25, col_values=(('external_ids', {'iface-id': 'b895845f-25a0-49ab-8be5-082a63b18d3d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:59:61', 'vm-uuid': '71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.053 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:18 compute-1 NetworkManager[55716]: <info>  [1769442078.0545] manager: (tapb895845f-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.055 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.060 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:18 compute-1 nova_compute[183403]: 2026-01-26 15:41:18.061 183407 INFO os_vif [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:59:61,bridge_name='br-int',has_traffic_filtering=True,id=b895845f-25a0-49ab-8be5-082a63b18d3d,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb895845f-25')
Jan 26 15:41:19 compute-1 openstack_network_exporter[195610]: ERROR   15:41:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:41:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:41:19 compute-1 openstack_network_exporter[195610]: ERROR   15:41:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:41:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:41:19 compute-1 nova_compute[183403]: 2026-01-26 15:41:19.733 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:41:19 compute-1 nova_compute[183403]: 2026-01-26 15:41:19.734 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 15:41:19 compute-1 nova_compute[183403]: 2026-01-26 15:41:19.734 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] No VIF found with MAC fa:16:3e:22:59:61, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 15:41:19 compute-1 nova_compute[183403]: 2026-01-26 15:41:19.735 183407 INFO nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Using config drive
Jan 26 15:41:20 compute-1 nova_compute[183403]: 2026-01-26 15:41:20.248 183407 WARNING neutronclient.v2_0.client [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:20 compute-1 nova_compute[183403]: 2026-01-26 15:41:20.647 183407 INFO nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Creating config drive at /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk.config
Jan 26 15:41:20 compute-1 nova_compute[183403]: 2026-01-26 15:41:20.653 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbbzw024b execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:20 compute-1 nova_compute[183403]: 2026-01-26 15:41:20.795 183407 DEBUG oslo_concurrency.processutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbbzw024b" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:20 compute-1 kernel: tapb895845f-25: entered promiscuous mode
Jan 26 15:41:20 compute-1 NetworkManager[55716]: <info>  [1769442080.8700] manager: (tapb895845f-25): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Jan 26 15:41:20 compute-1 ovn_controller[95641]: 2026-01-26T15:41:20Z|00254|binding|INFO|Claiming lport b895845f-25a0-49ab-8be5-082a63b18d3d for this chassis.
Jan 26 15:41:20 compute-1 nova_compute[183403]: 2026-01-26 15:41:20.870 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:20 compute-1 ovn_controller[95641]: 2026-01-26T15:41:20Z|00255|binding|INFO|b895845f-25a0-49ab-8be5-082a63b18d3d: Claiming fa:16:3e:22:59:61 10.100.0.9
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.878 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:59:61 10.100.0.9'], port_security=['fa:16:3e:22:59:61 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '4', 'neutron:security_group_ids': '41d84080-c1ba-42a6-8417-b57b30232ea3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2b7e03b-5278-4177-91b7-862e57a7c9ad, chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=b895845f-25a0-49ab-8be5-082a63b18d3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.879 104930 INFO neutron.agent.ovn.metadata.agent [-] Port b895845f-25a0-49ab-8be5-082a63b18d3d in datapath d9d38847-a43b-4d1e-a0b1-c3f77a879374 bound to our chassis
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.880 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:41:20 compute-1 nova_compute[183403]: 2026-01-26 15:41:20.884 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:20 compute-1 ovn_controller[95641]: 2026-01-26T15:41:20Z|00256|binding|INFO|Setting lport b895845f-25a0-49ab-8be5-082a63b18d3d up in Southbound
Jan 26 15:41:20 compute-1 ovn_controller[95641]: 2026-01-26T15:41:20Z|00257|binding|INFO|Setting lport b895845f-25a0-49ab-8be5-082a63b18d3d ovn-installed in OVS
Jan 26 15:41:20 compute-1 nova_compute[183403]: 2026-01-26 15:41:20.888 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:20 compute-1 nova_compute[183403]: 2026-01-26 15:41:20.890 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:20 compute-1 nova_compute[183403]: 2026-01-26 15:41:20.892 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.894 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0694daa1-8efd-4499-9a3f-fe847015ba20]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.896 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd9d38847-a1 in ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.899 203506 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd9d38847-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.900 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7c20fd-8945-41c0-a880-6c9234b48350]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.900 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[307f40c0-2df7-4300-b97e-65c2b167b33e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:20 compute-1 systemd-udevd[215617]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:41:20 compute-1 systemd-machined[154697]: New machine qemu-24-instance-00000021.
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.917 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[da29b16d-7927-4b80-900e-170959482524]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:20 compute-1 NetworkManager[55716]: <info>  [1769442080.9214] device (tapb895845f-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:41:20 compute-1 NetworkManager[55716]: <info>  [1769442080.9221] device (tapb895845f-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:41:20 compute-1 systemd[1]: Started Virtual Machine qemu-24-instance-00000021.
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.934 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[0b998bf5-f261-4587-a926-b132112c50f7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.965 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[0d3584c3-6b55-4511-b516-a1bc075a591e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:20 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:20.973 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[1ebc9400-710b-4242-a1af-211cac9a877c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:20 compute-1 NetworkManager[55716]: <info>  [1769442080.9739] manager: (tapd9d38847-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/95)
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.013 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[60efcd61-bbe4-415f-bd2e-1c54574f7905]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.015 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f6b691-ad22-4370-a0b3-25f93ee00726]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 NetworkManager[55716]: <info>  [1769442081.0453] device (tapd9d38847-a0): carrier: link connected
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.058 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf0534a-911f-4b31-8633-a275233ba01c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.085 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[214ac440-2fb5-4983-bf1b-b529e9e7c31e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9d38847-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:11:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580364, 'reachable_time': 38520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215651, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.107 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e2903c08-81a8-42ea-851a-31f357178cf7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:1127'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580364, 'tstamp': 580364}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215652, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.139 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4f752d-0057-4e41-9095-5b1e97303c93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9d38847-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:11:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580364, 'reachable_time': 38520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215653, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.196 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6fc291fe-6c7c-4fc9-814a-d7195d9f9581]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.292 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[6da0c4af-f3e9-4627-a688-cc1adba10dfe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.293 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9d38847-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.293 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.293 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9d38847-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.331 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:21 compute-1 kernel: tapd9d38847-a0: entered promiscuous mode
Jan 26 15:41:21 compute-1 NetworkManager[55716]: <info>  [1769442081.3323] manager: (tapd9d38847-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.333 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.334 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9d38847-a0, col_values=(('external_ids', {'iface-id': '70e6ec0a-21db-4d83-b0cb-0624424ede18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.335 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:21 compute-1 ovn_controller[95641]: 2026-01-26T15:41:21Z|00258|binding|INFO|Releasing lport 70e6ec0a-21db-4d83-b0cb-0624424ede18 from this chassis (sb_readonly=0)
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.336 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.346 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[672b67f4-4ee3-4ad6-b740-d8f8e1748c36]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.346 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.346 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.346 104930 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d9d38847-a43b-4d1e-a0b1-c3f77a879374 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.347 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.347 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.348 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[94c53762-fe89-466a-b4c1-eb74ba301c8c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.348 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.349 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3add38dd-7df8-41ac-aef3-96f18e509b18]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.349 104930 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: global
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     log         /dev/log local0 debug
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     log-tag     haproxy-metadata-proxy-d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     user        root
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     group       root
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     maxconn     1024
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     pidfile     /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     daemon
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: defaults
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     log global
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     mode http
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     option httplog
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     option dontlognull
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     option http-server-close
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     option forwardfor
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     retries                 3
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     timeout http-request    30s
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     timeout connect         30s
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     timeout client          32s
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     timeout server          32s
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     timeout http-keep-alive 30s
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: listen listener
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     bind 169.254.169.254:80
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:     http-request add-header X-OVN-Network-ID d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 15:41:21 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:21.350 104930 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'env', 'PROCESS_TAG=haproxy-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d9d38847-a43b-4d1e-a0b1-c3f77a879374.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.660 183407 DEBUG nova.compute.manager [req-a02300db-4bbe-456a-aba0-e8002f09ccd2 req-d7c3880f-dac2-4f11-b98b-b893fd9df552 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Received event network-vif-plugged-b895845f-25a0-49ab-8be5-082a63b18d3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.662 183407 DEBUG oslo_concurrency.lockutils [req-a02300db-4bbe-456a-aba0-e8002f09ccd2 req-d7c3880f-dac2-4f11-b98b-b893fd9df552 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.662 183407 DEBUG oslo_concurrency.lockutils [req-a02300db-4bbe-456a-aba0-e8002f09ccd2 req-d7c3880f-dac2-4f11-b98b-b893fd9df552 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.663 183407 DEBUG oslo_concurrency.lockutils [req-a02300db-4bbe-456a-aba0-e8002f09ccd2 req-d7c3880f-dac2-4f11-b98b-b893fd9df552 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.663 183407 DEBUG nova.compute.manager [req-a02300db-4bbe-456a-aba0-e8002f09ccd2 req-d7c3880f-dac2-4f11-b98b-b893fd9df552 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Processing event network-vif-plugged-b895845f-25a0-49ab-8be5-082a63b18d3d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.716 183407 DEBUG nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.721 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.726 183407 INFO nova.virt.libvirt.driver [-] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Instance spawned successfully.
Jan 26 15:41:21 compute-1 nova_compute[183403]: 2026-01-26 15:41:21.726 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 15:41:21 compute-1 podman[215692]: 2026-01-26 15:41:21.711640534 +0000 UTC m=+0.019076435 image pull d5bf96c5225682608353c2a38183b39c74c7c48343b54a579b3b6f3d81996637 38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 15:41:22 compute-1 nova_compute[183403]: 2026-01-26 15:41:22.242 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:41:22 compute-1 nova_compute[183403]: 2026-01-26 15:41:22.244 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:41:22 compute-1 nova_compute[183403]: 2026-01-26 15:41:22.245 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:41:22 compute-1 nova_compute[183403]: 2026-01-26 15:41:22.246 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:41:22 compute-1 nova_compute[183403]: 2026-01-26 15:41:22.247 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:41:22 compute-1 nova_compute[183403]: 2026-01-26 15:41:22.247 183407 DEBUG nova.virt.libvirt.driver [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 15:41:22 compute-1 nova_compute[183403]: 2026-01-26 15:41:22.263 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:22 compute-1 nova_compute[183403]: 2026-01-26 15:41:22.761 183407 INFO nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Took 10.04 seconds to spawn the instance on the hypervisor.
Jan 26 15:41:22 compute-1 nova_compute[183403]: 2026-01-26 15:41:22.762 183407 DEBUG nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:41:23 compute-1 nova_compute[183403]: 2026-01-26 15:41:23.055 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:23 compute-1 nova_compute[183403]: 2026-01-26 15:41:23.302 183407 INFO nova.compute.manager [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Took 15.58 seconds to build instance.
Jan 26 15:41:23 compute-1 podman[215692]: 2026-01-26 15:41:23.634411921 +0000 UTC m=+1.941847822 container create aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 15:41:23 compute-1 nova_compute[183403]: 2026-01-26 15:41:23.724 183407 DEBUG nova.compute.manager [req-9d20ed35-e888-4dec-ada8-855d02749012 req-ba49d42f-420c-4303-a58b-02b173ad1ecc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Received event network-vif-plugged-b895845f-25a0-49ab-8be5-082a63b18d3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:41:23 compute-1 nova_compute[183403]: 2026-01-26 15:41:23.725 183407 DEBUG oslo_concurrency.lockutils [req-9d20ed35-e888-4dec-ada8-855d02749012 req-ba49d42f-420c-4303-a58b-02b173ad1ecc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:23 compute-1 nova_compute[183403]: 2026-01-26 15:41:23.725 183407 DEBUG oslo_concurrency.lockutils [req-9d20ed35-e888-4dec-ada8-855d02749012 req-ba49d42f-420c-4303-a58b-02b173ad1ecc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:23 compute-1 nova_compute[183403]: 2026-01-26 15:41:23.726 183407 DEBUG oslo_concurrency.lockutils [req-9d20ed35-e888-4dec-ada8-855d02749012 req-ba49d42f-420c-4303-a58b-02b173ad1ecc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:23 compute-1 nova_compute[183403]: 2026-01-26 15:41:23.726 183407 DEBUG nova.compute.manager [req-9d20ed35-e888-4dec-ada8-855d02749012 req-ba49d42f-420c-4303-a58b-02b173ad1ecc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] No waiting events found dispatching network-vif-plugged-b895845f-25a0-49ab-8be5-082a63b18d3d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:41:23 compute-1 nova_compute[183403]: 2026-01-26 15:41:23.726 183407 WARNING nova.compute.manager [req-9d20ed35-e888-4dec-ada8-855d02749012 req-ba49d42f-420c-4303-a58b-02b173ad1ecc 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Received unexpected event network-vif-plugged-b895845f-25a0-49ab-8be5-082a63b18d3d for instance with vm_state active and task_state None.
Jan 26 15:41:23 compute-1 nova_compute[183403]: 2026-01-26 15:41:23.809 183407 DEBUG oslo_concurrency.lockutils [None req-d9c46636-d9f6-400a-b3b6-f91d486a2b68 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:24 compute-1 systemd[1]: Started libpod-conmon-aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3.scope.
Jan 26 15:41:24 compute-1 systemd[1]: Started libcrun container.
Jan 26 15:41:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/269752733cf3f9223258ac003f284996460fff72e6d92bc82b012d41e1185d5a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 15:41:24 compute-1 podman[215692]: 2026-01-26 15:41:24.468963909 +0000 UTC m=+2.776399870 container init aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20260120)
Jan 26 15:41:24 compute-1 podman[215692]: 2026-01-26 15:41:24.479589293 +0000 UTC m=+2.787025204 container start aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 15:41:24 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215708]: [NOTICE]   (215731) : New worker (215733) forked
Jan 26 15:41:24 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215708]: [NOTICE]   (215731) : Loading success.
Jan 26 15:41:24 compute-1 podman[215711]: 2026-01-26 15:41:24.672519131 +0000 UTC m=+0.498633477 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 15:41:24 compute-1 podman[215709]: 2026-01-26 15:41:24.730119133 +0000 UTC m=+0.556192348 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Jan 26 15:41:26 compute-1 nova_compute[183403]: 2026-01-26 15:41:26.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:27 compute-1 nova_compute[183403]: 2026-01-26 15:41:27.265 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:28 compute-1 nova_compute[183403]: 2026-01-26 15:41:28.058 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:29.101 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:29.101 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:29.102 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:30 compute-1 nova_compute[183403]: 2026-01-26 15:41:30.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:31 compute-1 nova_compute[183403]: 2026-01-26 15:41:31.179 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:31 compute-1 nova_compute[183403]: 2026-01-26 15:41:31.179 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:31 compute-1 nova_compute[183403]: 2026-01-26 15:41:31.179 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:31 compute-1 nova_compute[183403]: 2026-01-26 15:41:31.180 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.249 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.267 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.308 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.309 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.359 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.528 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.529 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.557 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.558 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5604MB free_disk=73.14387130737305GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.559 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:32 compute-1 nova_compute[183403]: 2026-01-26 15:41:32.559 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:33 compute-1 nova_compute[183403]: 2026-01-26 15:41:33.062 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:33 compute-1 nova_compute[183403]: 2026-01-26 15:41:33.706 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 15:41:34 compute-1 nova_compute[183403]: 2026-01-26 15:41:34.219 183407 WARNING nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 945ea863-2fb6-4499-8ade-804189f3375f has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 26 15:41:34 compute-1 nova_compute[183403]: 2026-01-26 15:41:34.220 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:41:34 compute-1 nova_compute[183403]: 2026-01-26 15:41:34.220 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:41:32 up  1:36,  0 user,  load average: 0.43, 0.23, 0.20\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_af8eea38f1d74ad1a01087c020ea8d02': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:41:34 compute-1 nova_compute[183403]: 2026-01-26 15:41:34.280 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:41:34 compute-1 nova_compute[183403]: 2026-01-26 15:41:34.789 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:41:34 compute-1 nova_compute[183403]: 2026-01-26 15:41:34.956 183407 DEBUG nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Creating tmpfile /var/lib/nova/instances/tmprzzsoqqi to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 15:41:34 compute-1 nova_compute[183403]: 2026-01-26 15:41:34.958 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:35 compute-1 nova_compute[183403]: 2026-01-26 15:41:35.040 183407 DEBUG nova.compute.manager [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprzzsoqqi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 15:41:35 compute-1 ovn_controller[95641]: 2026-01-26T15:41:35Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:59:61 10.100.0.9
Jan 26 15:41:35 compute-1 ovn_controller[95641]: 2026-01-26T15:41:35Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:59:61 10.100.0.9
Jan 26 15:41:35 compute-1 nova_compute[183403]: 2026-01-26 15:41:35.304 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:41:35 compute-1 nova_compute[183403]: 2026-01-26 15:41:35.305 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.746s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:35 compute-1 nova_compute[183403]: 2026-01-26 15:41:35.305 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:35 compute-1 nova_compute[183403]: 2026-01-26 15:41:35.306 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:41:35 compute-1 podman[192725]: time="2026-01-26T15:41:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:41:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:41:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:41:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:41:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2658 "" "Go-http-client/1.1"
Jan 26 15:41:37 compute-1 nova_compute[183403]: 2026-01-26 15:41:37.066 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:37 compute-1 nova_compute[183403]: 2026-01-26 15:41:37.269 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:38 compute-1 nova_compute[183403]: 2026-01-26 15:41:38.067 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:38 compute-1 nova_compute[183403]: 2026-01-26 15:41:38.812 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:38 compute-1 nova_compute[183403]: 2026-01-26 15:41:38.813 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:38 compute-1 nova_compute[183403]: 2026-01-26 15:41:38.814 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:38 compute-1 nova_compute[183403]: 2026-01-26 15:41:38.814 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:38 compute-1 nova_compute[183403]: 2026-01-26 15:41:38.815 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:38 compute-1 nova_compute[183403]: 2026-01-26 15:41:38.815 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:38 compute-1 nova_compute[183403]: 2026-01-26 15:41:38.816 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:41:41 compute-1 nova_compute[183403]: 2026-01-26 15:41:41.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:41 compute-1 nova_compute[183403]: 2026-01-26 15:41:41.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:41:42 compute-1 nova_compute[183403]: 2026-01-26 15:41:42.084 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:41:42 compute-1 nova_compute[183403]: 2026-01-26 15:41:42.271 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:42 compute-1 nova_compute[183403]: 2026-01-26 15:41:42.552 183407 DEBUG nova.compute.manager [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprzzsoqqi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='945ea863-2fb6-4499-8ade-804189f3375f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 15:41:43 compute-1 nova_compute[183403]: 2026-01-26 15:41:43.070 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:43 compute-1 nova_compute[183403]: 2026-01-26 15:41:43.570 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-945ea863-2fb6-4499-8ade-804189f3375f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:41:43 compute-1 nova_compute[183403]: 2026-01-26 15:41:43.571 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-945ea863-2fb6-4499-8ade-804189f3375f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:41:43 compute-1 nova_compute[183403]: 2026-01-26 15:41:43.572 183407 DEBUG nova.network.neutron [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:41:44 compute-1 nova_compute[183403]: 2026-01-26 15:41:44.164 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:44 compute-1 nova_compute[183403]: 2026-01-26 15:41:44.867 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:44 compute-1 podman[215793]: 2026-01-26 15:41:44.891103795 +0000 UTC m=+0.057999363 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter)
Jan 26 15:41:44 compute-1 podman[215792]: 2026-01-26 15:41:44.895063228 +0000 UTC m=+0.062766347 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:41:45 compute-1 nova_compute[183403]: 2026-01-26 15:41:45.021 183407 DEBUG nova.network.neutron [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Updating instance_info_cache with network_info: [{"id": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "address": "fa:16:3e:fa:22:00", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2fc018e-f9", "ovs_interfaceid": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:41:45 compute-1 nova_compute[183403]: 2026-01-26 15:41:45.640 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-945ea863-2fb6-4499-8ade-804189f3375f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:41:45 compute-1 nova_compute[183403]: 2026-01-26 15:41:45.716 183407 DEBUG nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprzzsoqqi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='945ea863-2fb6-4499-8ade-804189f3375f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 15:41:45 compute-1 nova_compute[183403]: 2026-01-26 15:41:45.717 183407 DEBUG nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Creating instance directory: /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 15:41:45 compute-1 nova_compute[183403]: 2026-01-26 15:41:45.718 183407 DEBUG nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Creating disk.info with the contents: {'/var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk': 'qcow2', '/var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 15:41:45 compute-1 nova_compute[183403]: 2026-01-26 15:41:45.718 183407 DEBUG nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 15:41:45 compute-1 nova_compute[183403]: 2026-01-26 15:41:45.719 183407 DEBUG nova.objects.instance [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 945ea863-2fb6-4499-8ade-804189f3375f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.234 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.240 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.242 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.301 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.303 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.303 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.304 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.308 183407 DEBUG oslo_utils.imageutils.format_inspector [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.308 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.374 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.375 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.654 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0,backing_fmt=raw /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk 1073741824" returned: 0 in 0.279s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.655 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "d88b40040782a1a5d836951a799a7b8c2aa335f0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.352s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.656 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.721 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d88b40040782a1a5d836951a799a7b8c2aa335f0 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.722 183407 DEBUG nova.virt.disk.api [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Checking if we can resize image /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.722 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.776 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.777 183407 DEBUG nova.virt.disk.api [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Cannot resize image /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 15:41:46 compute-1 nova_compute[183403]: 2026-01-26 15:41:46.777 183407 DEBUG nova.objects.instance [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lazy-loading 'migration_context' on Instance uuid 945ea863-2fb6-4499-8ade-804189f3375f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.274 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.285 183407 DEBUG nova.objects.base [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Object Instance<945ea863-2fb6-4499-8ade-804189f3375f> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.285 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.314 183407 DEBUG oslo_concurrency.processutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f/disk.config 497664" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.315 183407 DEBUG nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.316 183407 DEBUG nova.virt.libvirt.vif [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T15:40:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1953533895',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1953533895',id=32,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:41:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-4n2bj1fm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T15:41:00Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=945ea863-2fb6-4499-8ade-804189f3375f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "address": "fa:16:3e:fa:22:00", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa2fc018e-f9", "ovs_interfaceid": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.316 183407 DEBUG nova.network.os_vif_util [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converting VIF {"id": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "address": "fa:16:3e:fa:22:00", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa2fc018e-f9", "ovs_interfaceid": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.317 183407 DEBUG nova.network.os_vif_util [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:22:00,bridge_name='br-int',has_traffic_filtering=True,id=a2fc018e-f92a-4576-b5f0-2d9d66e80c23,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2fc018e-f9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.318 183407 DEBUG os_vif [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:22:00,bridge_name='br-int',has_traffic_filtering=True,id=a2fc018e-f92a-4576-b5f0-2d9d66e80c23,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2fc018e-f9') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.318 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.319 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.319 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.320 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.320 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '94672caa-c0a1-5178-b5f7-b8a72d4ea714', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.323 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.329 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.329 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2fc018e-f9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.330 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa2fc018e-f9, col_values=(('qos', UUID('642f78f8-54ac-4c17-9f46-b81b44edf90b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.330 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa2fc018e-f9, col_values=(('external_ids', {'iface-id': 'a2fc018e-f92a-4576-b5f0-2d9d66e80c23', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:22:00', 'vm-uuid': '945ea863-2fb6-4499-8ade-804189f3375f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:47 compute-1 NetworkManager[55716]: <info>  [1769442107.3340] manager: (tapa2fc018e-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.333 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.342 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.343 183407 INFO os_vif [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:22:00,bridge_name='br-int',has_traffic_filtering=True,id=a2fc018e-f92a-4576-b5f0-2d9d66e80c23,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2fc018e-f9')
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.343 183407 DEBUG nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.344 183407 DEBUG nova.compute.manager [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprzzsoqqi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='945ea863-2fb6-4499-8ade-804189f3375f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.344 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:47 compute-1 nova_compute[183403]: 2026-01-26 15:41:47.536 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:48 compute-1 nova_compute[183403]: 2026-01-26 15:41:48.225 183407 DEBUG nova.network.neutron [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Port a2fc018e-f92a-4576-b5f0-2d9d66e80c23 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 15:41:48 compute-1 nova_compute[183403]: 2026-01-26 15:41:48.236 183407 DEBUG nova.compute.manager [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprzzsoqqi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='945ea863-2fb6-4499-8ade-804189f3375f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 15:41:49 compute-1 openstack_network_exporter[195610]: ERROR   15:41:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:41:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:41:49 compute-1 openstack_network_exporter[195610]: ERROR   15:41:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:41:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:41:50 compute-1 ovn_controller[95641]: 2026-01-26T15:41:50Z|00259|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 15:41:51 compute-1 kernel: tapa2fc018e-f9: entered promiscuous mode
Jan 26 15:41:51 compute-1 NetworkManager[55716]: <info>  [1769442111.7108] manager: (tapa2fc018e-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Jan 26 15:41:51 compute-1 nova_compute[183403]: 2026-01-26 15:41:51.710 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:51 compute-1 ovn_controller[95641]: 2026-01-26T15:41:51Z|00260|binding|INFO|Claiming lport a2fc018e-f92a-4576-b5f0-2d9d66e80c23 for this additional chassis.
Jan 26 15:41:51 compute-1 ovn_controller[95641]: 2026-01-26T15:41:51Z|00261|binding|INFO|a2fc018e-f92a-4576-b5f0-2d9d66e80c23: Claiming fa:16:3e:fa:22:00 10.100.0.7
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.721 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:22:00 10.100.0.7'], port_security=['fa:16:3e:fa:22:00 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '945ea863-2fb6-4499-8ade-804189f3375f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '10', 'neutron:security_group_ids': '41d84080-c1ba-42a6-8417-b57b30232ea3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2b7e03b-5278-4177-91b7-862e57a7c9ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a2fc018e-f92a-4576-b5f0-2d9d66e80c23) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.722 104930 INFO neutron.agent.ovn.metadata.agent [-] Port a2fc018e-f92a-4576-b5f0-2d9d66e80c23 in datapath d9d38847-a43b-4d1e-a0b1-c3f77a879374 unbound from our chassis
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.724 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:41:51 compute-1 nova_compute[183403]: 2026-01-26 15:41:51.735 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:51 compute-1 ovn_controller[95641]: 2026-01-26T15:41:51Z|00262|binding|INFO|Setting lport a2fc018e-f92a-4576-b5f0-2d9d66e80c23 ovn-installed in OVS
Jan 26 15:41:51 compute-1 nova_compute[183403]: 2026-01-26 15:41:51.737 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:51 compute-1 nova_compute[183403]: 2026-01-26 15:41:51.740 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.745 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c65db179-7d29-4d29-b4bf-28e8fde3387b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:51 compute-1 systemd-udevd[215872]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 15:41:51 compute-1 NetworkManager[55716]: <info>  [1769442111.7642] device (tapa2fc018e-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 15:41:51 compute-1 NetworkManager[55716]: <info>  [1769442111.7651] device (tapa2fc018e-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 15:41:51 compute-1 systemd-machined[154697]: New machine qemu-25-instance-00000020.
Jan 26 15:41:51 compute-1 systemd[1]: Started Virtual Machine qemu-25-instance-00000020.
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.792 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[4b272487-dd34-42dc-8378-9c2dcdd3f8d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.796 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[8369375d-4854-4b2b-bc98-2a2a4fb064e1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.830 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[287b6dd8-539d-4aea-86e4-79cbc92cc871]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.850 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0fc8ae-8240-4bf8-b111-79e4a48d1aa5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9d38847-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:11:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580364, 'reachable_time': 38520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215885, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.867 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[59a7124e-5f2c-4808-acf2-a0b9ae7984fd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd9d38847-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580383, 'tstamp': 580383}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215887, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd9d38847-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580387, 'tstamp': 580387}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215887, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.869 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9d38847-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:51 compute-1 nova_compute[183403]: 2026-01-26 15:41:51.871 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.872 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9d38847-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.873 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.873 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9d38847-a0, col_values=(('external_ids', {'iface-id': '70e6ec0a-21db-4d83-b0cb-0624424ede18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.873 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:41:51 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:41:51.874 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a8f80850-d3bc-4bdd-9dde-73a98708de6c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d9d38847-a43b-4d1e-a0b1-c3f77a879374\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d9d38847-a43b-4d1e-a0b1-c3f77a879374\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:41:52 compute-1 nova_compute[183403]: 2026-01-26 15:41:52.276 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:52 compute-1 nova_compute[183403]: 2026-01-26 15:41:52.331 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:54 compute-1 podman[215909]: 2026-01-26 15:41:54.919602301 +0000 UTC m=+0.083224517 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 15:41:54 compute-1 podman[215908]: 2026-01-26 15:41:54.928706566 +0000 UTC m=+0.092905377 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4)
Jan 26 15:41:55 compute-1 ovn_controller[95641]: 2026-01-26T15:41:55Z|00263|binding|INFO|Claiming lport a2fc018e-f92a-4576-b5f0-2d9d66e80c23 for this chassis.
Jan 26 15:41:55 compute-1 ovn_controller[95641]: 2026-01-26T15:41:55Z|00264|binding|INFO|a2fc018e-f92a-4576-b5f0-2d9d66e80c23: Claiming fa:16:3e:fa:22:00 10.100.0.7
Jan 26 15:41:55 compute-1 ovn_controller[95641]: 2026-01-26T15:41:55Z|00265|binding|INFO|Setting lport a2fc018e-f92a-4576-b5f0-2d9d66e80c23 up in Southbound
Jan 26 15:41:56 compute-1 nova_compute[183403]: 2026-01-26 15:41:56.985 183407 INFO nova.compute.manager [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Post operation of migration started
Jan 26 15:41:56 compute-1 nova_compute[183403]: 2026-01-26 15:41:56.986 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:57 compute-1 nova_compute[183403]: 2026-01-26 15:41:57.279 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:57 compute-1 nova_compute[183403]: 2026-01-26 15:41:57.333 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:41:57 compute-1 nova_compute[183403]: 2026-01-26 15:41:57.602 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:57 compute-1 nova_compute[183403]: 2026-01-26 15:41:57.603 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:57 compute-1 nova_compute[183403]: 2026-01-26 15:41:57.751 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "refresh_cache-945ea863-2fb6-4499-8ade-804189f3375f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 15:41:57 compute-1 nova_compute[183403]: 2026-01-26 15:41:57.752 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquired lock "refresh_cache-945ea863-2fb6-4499-8ade-804189f3375f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 15:41:57 compute-1 nova_compute[183403]: 2026-01-26 15:41:57.752 183407 DEBUG nova.network.neutron [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 15:41:58 compute-1 nova_compute[183403]: 2026-01-26 15:41:58.258 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:58 compute-1 nova_compute[183403]: 2026-01-26 15:41:58.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:41:58 compute-1 nova_compute[183403]: 2026-01-26 15:41:58.701 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:41:58 compute-1 nova_compute[183403]: 2026-01-26 15:41:58.884 183407 DEBUG nova.network.neutron [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Updating instance_info_cache with network_info: [{"id": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "address": "fa:16:3e:fa:22:00", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2fc018e-f9", "ovs_interfaceid": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:41:59 compute-1 nova_compute[183403]: 2026-01-26 15:41:59.415 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Releasing lock "refresh_cache-945ea863-2fb6-4499-8ade-804189f3375f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 15:42:00 compute-1 nova_compute[183403]: 2026-01-26 15:42:00.054 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:00 compute-1 nova_compute[183403]: 2026-01-26 15:42:00.054 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:00 compute-1 nova_compute[183403]: 2026-01-26 15:42:00.055 183407 DEBUG oslo_concurrency.lockutils [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:00 compute-1 nova_compute[183403]: 2026-01-26 15:42:00.059 183407 INFO nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 15:42:00 compute-1 virtqemud[183290]: Domain id=25 name='instance-00000020' uuid=945ea863-2fb6-4499-8ade-804189f3375f is tainted: custom-monitor
Jan 26 15:42:01 compute-1 nova_compute[183403]: 2026-01-26 15:42:01.067 183407 INFO nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 15:42:02 compute-1 nova_compute[183403]: 2026-01-26 15:42:02.076 183407 INFO nova.virt.libvirt.driver [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 15:42:02 compute-1 nova_compute[183403]: 2026-01-26 15:42:02.081 183407 DEBUG nova.compute.manager [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 15:42:02 compute-1 nova_compute[183403]: 2026-01-26 15:42:02.282 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:02 compute-1 nova_compute[183403]: 2026-01-26 15:42:02.335 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:02 compute-1 nova_compute[183403]: 2026-01-26 15:42:02.594 183407 DEBUG nova.objects.instance [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 15:42:03 compute-1 nova_compute[183403]: 2026-01-26 15:42:03.614 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:42:03 compute-1 nova_compute[183403]: 2026-01-26 15:42:03.695 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:42:03 compute-1 nova_compute[183403]: 2026-01-26 15:42:03.696 183407 WARNING neutronclient.v2_0.client [None req-a653e10b-fb00-44d2-9eda-68abf9a094fa a70278f05b6c45b1a7444c22be9d79c2 efd5236a68844b42ac3b0c5bd372db1e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:42:04 compute-1 sshd-session[215952]: Invalid user config from 176.120.22.13 port 27012
Jan 26 15:42:05 compute-1 podman[192725]: time="2026-01-26T15:42:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:42:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:42:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 26 15:42:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:42:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2661 "" "Go-http-client/1.1"
Jan 26 15:42:05 compute-1 sshd-session[215952]: Connection reset by invalid user config 176.120.22.13 port 27012 [preauth]
Jan 26 15:42:07 compute-1 nova_compute[183403]: 2026-01-26 15:42:07.284 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:07 compute-1 nova_compute[183403]: 2026-01-26 15:42:07.337 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:07 compute-1 sshd-session[215954]: Invalid user supervisor from 176.120.22.13 port 27016
Jan 26 15:42:08 compute-1 sshd-session[215954]: Connection reset by invalid user supervisor 176.120.22.13 port 27016 [preauth]
Jan 26 15:42:10 compute-1 sshd-session[215956]: Invalid user user1 from 176.120.22.13 port 27030
Jan 26 15:42:10 compute-1 sshd-session[215956]: Connection reset by invalid user user1 176.120.22.13 port 27030 [preauth]
Jan 26 15:42:12 compute-1 sshd-session[215958]: Invalid user mysql from 176.120.22.13 port 27048
Jan 26 15:42:12 compute-1 nova_compute[183403]: 2026-01-26 15:42:12.286 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:12 compute-1 nova_compute[183403]: 2026-01-26 15:42:12.338 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:12 compute-1 sshd-session[215958]: Connection reset by invalid user mysql 176.120.22.13 port 27048 [preauth]
Jan 26 15:42:15 compute-1 sshd-session[215960]: Connection reset by authenticating user ftp 176.120.22.13 port 56494 [preauth]
Jan 26 15:42:15 compute-1 podman[215962]: 2026-01-26 15:42:15.89703675 +0000 UTC m=+0.075755513 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:42:15 compute-1 podman[215963]: 2026-01-26 15:42:15.900071267 +0000 UTC m=+0.071594068 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350)
Jan 26 15:42:17 compute-1 nova_compute[183403]: 2026-01-26 15:42:17.288 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:17 compute-1 nova_compute[183403]: 2026-01-26 15:42:17.340 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:17 compute-1 nova_compute[183403]: 2026-01-26 15:42:17.416 183407 DEBUG oslo_concurrency.lockutils [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:17 compute-1 nova_compute[183403]: 2026-01-26 15:42:17.416 183407 DEBUG oslo_concurrency.lockutils [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:17 compute-1 nova_compute[183403]: 2026-01-26 15:42:17.416 183407 DEBUG oslo_concurrency.lockutils [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:17 compute-1 nova_compute[183403]: 2026-01-26 15:42:17.417 183407 DEBUG oslo_concurrency.lockutils [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:17 compute-1 nova_compute[183403]: 2026-01-26 15:42:17.417 183407 DEBUG oslo_concurrency.lockutils [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:17 compute-1 nova_compute[183403]: 2026-01-26 15:42:17.442 183407 INFO nova.compute.manager [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Terminating instance
Jan 26 15:42:17 compute-1 nova_compute[183403]: 2026-01-26 15:42:17.962 183407 DEBUG nova.compute.manager [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:42:18 compute-1 kernel: tapb895845f-25 (unregistering): left promiscuous mode
Jan 26 15:42:18 compute-1 NetworkManager[55716]: <info>  [1769442138.1082] device (tapb895845f-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:42:18 compute-1 ovn_controller[95641]: 2026-01-26T15:42:18Z|00266|binding|INFO|Releasing lport b895845f-25a0-49ab-8be5-082a63b18d3d from this chassis (sb_readonly=0)
Jan 26 15:42:18 compute-1 ovn_controller[95641]: 2026-01-26T15:42:18Z|00267|binding|INFO|Setting lport b895845f-25a0-49ab-8be5-082a63b18d3d down in Southbound
Jan 26 15:42:18 compute-1 ovn_controller[95641]: 2026-01-26T15:42:18Z|00268|binding|INFO|Removing iface tapb895845f-25 ovn-installed in OVS
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.123 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.125 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.145 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.158 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:59:61 10.100.0.9'], port_security=['fa:16:3e:22:59:61 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '5', 'neutron:security_group_ids': '41d84080-c1ba-42a6-8417-b57b30232ea3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2b7e03b-5278-4177-91b7-862e57a7c9ad, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=b895845f-25a0-49ab-8be5-082a63b18d3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.160 104930 INFO neutron.agent.ovn.metadata.agent [-] Port b895845f-25a0-49ab-8be5-082a63b18d3d in datapath d9d38847-a43b-4d1e-a0b1-c3f77a879374 unbound from our chassis
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.161 104930 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9d38847-a43b-4d1e-a0b1-c3f77a879374
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.181 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[c01911a0-f9e9-4690-beda-5de8cc57241d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:18 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000021.scope: Deactivated successfully.
Jan 26 15:42:18 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000021.scope: Consumed 16.238s CPU time.
Jan 26 15:42:18 compute-1 systemd-machined[154697]: Machine qemu-24-instance-00000021 terminated.
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.215 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[02f95d8f-8c9b-4ec1-b977-389997f5a3e3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.218 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[9a6c6c23-929e-46e2-8876-2a64a6ec8186]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.250 204665 DEBUG oslo.privsep.daemon [-] privsep: reply[048e3ba4-7dc4-471a-90f1-d96da94cb9b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.267 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[fccb9071-6cc4-4881-98b0-2079062f0d3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9d38847-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:11:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580364, 'reachable_time': 38520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216020, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.285 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[b6db05ef-5e62-44fd-8654-fb1f07d34aac]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd9d38847-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580383, 'tstamp': 580383}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216021, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd9d38847-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580387, 'tstamp': 580387}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216021, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.287 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9d38847-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.288 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.292 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.293 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9d38847-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.293 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.293 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9d38847-a0, col_values=(('external_ids', {'iface-id': '70e6ec0a-21db-4d83-b0cb-0624424ede18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.294 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.295 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c29052-d946-4b56-9188-a06bd7733b3a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d9d38847-a43b-4d1e-a0b1-c3f77a879374\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d9d38847-a43b-4d1e-a0b1-c3f77a879374\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.367 183407 DEBUG nova.compute.manager [req-ceaeeea0-57f1-4936-a94e-a8b954482b0a req-b324b75a-3fd9-4acd-b5a8-f9dad3cf5532 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Received event network-vif-unplugged-b895845f-25a0-49ab-8be5-082a63b18d3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.368 183407 DEBUG oslo_concurrency.lockutils [req-ceaeeea0-57f1-4936-a94e-a8b954482b0a req-b324b75a-3fd9-4acd-b5a8-f9dad3cf5532 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.368 183407 DEBUG oslo_concurrency.lockutils [req-ceaeeea0-57f1-4936-a94e-a8b954482b0a req-b324b75a-3fd9-4acd-b5a8-f9dad3cf5532 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.368 183407 DEBUG oslo_concurrency.lockutils [req-ceaeeea0-57f1-4936-a94e-a8b954482b0a req-b324b75a-3fd9-4acd-b5a8-f9dad3cf5532 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.369 183407 DEBUG nova.compute.manager [req-ceaeeea0-57f1-4936-a94e-a8b954482b0a req-b324b75a-3fd9-4acd-b5a8-f9dad3cf5532 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] No waiting events found dispatching network-vif-unplugged-b895845f-25a0-49ab-8be5-082a63b18d3d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.369 183407 DEBUG nova.compute.manager [req-ceaeeea0-57f1-4936-a94e-a8b954482b0a req-b324b75a-3fd9-4acd-b5a8-f9dad3cf5532 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Received event network-vif-unplugged-b895845f-25a0-49ab-8be5-082a63b18d3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.382 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.389 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.439 183407 INFO nova.virt.libvirt.driver [-] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Instance destroyed successfully.
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.440 183407 DEBUG nova.objects.instance [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lazy-loading 'resources' on Instance uuid 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.711 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:42:18 compute-1 nova_compute[183403]: 2026-01-26 15:42:18.711 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:18 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:18.712 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.022 183407 DEBUG nova.virt.libvirt.vif [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T15:41:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1987266720',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1987266720',id=33,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:41:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-vyy0mpry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:41:22Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": "b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.022 183407 DEBUG nova.network.os_vif_util [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converting VIF {"id": "b895845f-25a0-49ab-8be5-082a63b18d3d", "address": "fa:16:3e:22:59:61", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb895845f-25", "ovs_interfaceid": "b895845f-25a0-49ab-8be5-082a63b18d3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.023 183407 DEBUG nova.network.os_vif_util [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:59:61,bridge_name='br-int',has_traffic_filtering=True,id=b895845f-25a0-49ab-8be5-082a63b18d3d,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb895845f-25') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.024 183407 DEBUG os_vif [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:59:61,bridge_name='br-int',has_traffic_filtering=True,id=b895845f-25a0-49ab-8be5-082a63b18d3d,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb895845f-25') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.027 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.027 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb895845f-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.029 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.031 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.033 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.033 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=eef71478-4552-4cfa-9c7f-3d311c99d418) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.034 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.036 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.040 183407 INFO os_vif [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:59:61,bridge_name='br-int',has_traffic_filtering=True,id=b895845f-25a0-49ab-8be5-082a63b18d3d,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb895845f-25')
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.041 183407 INFO nova.virt.libvirt.driver [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Deleting instance files /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8_del
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.042 183407 INFO nova.virt.libvirt.driver [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Deletion of /var/lib/nova/instances/71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8_del complete
Jan 26 15:42:19 compute-1 openstack_network_exporter[195610]: ERROR   15:42:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:42:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:42:19 compute-1 openstack_network_exporter[195610]: ERROR   15:42:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:42:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.597 183407 INFO nova.compute.manager [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Took 1.64 seconds to destroy the instance on the hypervisor.
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.598 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.598 183407 DEBUG nova.compute.manager [-] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.599 183407 DEBUG nova.network.neutron [-] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:42:19 compute-1 nova_compute[183403]: 2026-01-26 15:42:19.599 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:42:19 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:19.714 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:42:20 compute-1 nova_compute[183403]: 2026-01-26 15:42:20.518 183407 DEBUG nova.compute.manager [req-c3360a15-0b2c-453f-9ee6-610bc7fb7c22 req-8c2528b9-ccfc-431d-90d0-9462d1014bfe 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Received event network-vif-unplugged-b895845f-25a0-49ab-8be5-082a63b18d3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:42:20 compute-1 nova_compute[183403]: 2026-01-26 15:42:20.519 183407 DEBUG oslo_concurrency.lockutils [req-c3360a15-0b2c-453f-9ee6-610bc7fb7c22 req-8c2528b9-ccfc-431d-90d0-9462d1014bfe 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:20 compute-1 nova_compute[183403]: 2026-01-26 15:42:20.519 183407 DEBUG oslo_concurrency.lockutils [req-c3360a15-0b2c-453f-9ee6-610bc7fb7c22 req-8c2528b9-ccfc-431d-90d0-9462d1014bfe 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:20 compute-1 nova_compute[183403]: 2026-01-26 15:42:20.519 183407 DEBUG oslo_concurrency.lockutils [req-c3360a15-0b2c-453f-9ee6-610bc7fb7c22 req-8c2528b9-ccfc-431d-90d0-9462d1014bfe 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:20 compute-1 nova_compute[183403]: 2026-01-26 15:42:20.519 183407 DEBUG nova.compute.manager [req-c3360a15-0b2c-453f-9ee6-610bc7fb7c22 req-8c2528b9-ccfc-431d-90d0-9462d1014bfe 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] No waiting events found dispatching network-vif-unplugged-b895845f-25a0-49ab-8be5-082a63b18d3d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:42:20 compute-1 nova_compute[183403]: 2026-01-26 15:42:20.520 183407 DEBUG nova.compute.manager [req-c3360a15-0b2c-453f-9ee6-610bc7fb7c22 req-8c2528b9-ccfc-431d-90d0-9462d1014bfe 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Received event network-vif-unplugged-b895845f-25a0-49ab-8be5-082a63b18d3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:42:20 compute-1 nova_compute[183403]: 2026-01-26 15:42:20.632 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:42:22 compute-1 nova_compute[183403]: 2026-01-26 15:42:22.276 183407 DEBUG nova.compute.manager [req-ff1cb6bd-1fbd-4340-9871-6e6e2296ec73 req-3446dff5-f269-4f42-9291-89f79acc9caf 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Received event network-vif-deleted-b895845f-25a0-49ab-8be5-082a63b18d3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:42:22 compute-1 nova_compute[183403]: 2026-01-26 15:42:22.277 183407 INFO nova.compute.manager [req-ff1cb6bd-1fbd-4340-9871-6e6e2296ec73 req-3446dff5-f269-4f42-9291-89f79acc9caf 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Neutron deleted interface b895845f-25a0-49ab-8be5-082a63b18d3d; detaching it from the instance and deleting it from the info cache
Jan 26 15:42:22 compute-1 nova_compute[183403]: 2026-01-26 15:42:22.277 183407 DEBUG nova.network.neutron [req-ff1cb6bd-1fbd-4340-9871-6e6e2296ec73 req-3446dff5-f269-4f42-9291-89f79acc9caf 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:42:22 compute-1 nova_compute[183403]: 2026-01-26 15:42:22.290 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:22 compute-1 nova_compute[183403]: 2026-01-26 15:42:22.762 183407 DEBUG nova.network.neutron [-] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:42:22 compute-1 nova_compute[183403]: 2026-01-26 15:42:22.988 183407 DEBUG nova.compute.manager [req-ff1cb6bd-1fbd-4340-9871-6e6e2296ec73 req-3446dff5-f269-4f42-9291-89f79acc9caf 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Detach interface failed, port_id=b895845f-25a0-49ab-8be5-082a63b18d3d, reason: Instance 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:42:23 compute-1 nova_compute[183403]: 2026-01-26 15:42:23.507 183407 INFO nova.compute.manager [-] [instance: 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8] Took 3.91 seconds to deallocate network for instance.
Jan 26 15:42:24 compute-1 nova_compute[183403]: 2026-01-26 15:42:24.037 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:24 compute-1 nova_compute[183403]: 2026-01-26 15:42:24.299 183407 DEBUG oslo_concurrency.lockutils [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:24 compute-1 nova_compute[183403]: 2026-01-26 15:42:24.300 183407 DEBUG oslo_concurrency.lockutils [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:24 compute-1 nova_compute[183403]: 2026-01-26 15:42:24.367 183407 DEBUG nova.compute.provider_tree [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:42:25 compute-1 nova_compute[183403]: 2026-01-26 15:42:25.884 183407 DEBUG nova.scheduler.client.report [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:42:25 compute-1 podman[216041]: 2026-01-26 15:42:25.920087646 +0000 UTC m=+0.083503681 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 15:42:25 compute-1 podman[216040]: 2026-01-26 15:42:25.968647049 +0000 UTC m=+0.131620923 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 15:42:26 compute-1 nova_compute[183403]: 2026-01-26 15:42:26.427 183407 DEBUG oslo_concurrency.lockutils [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:26 compute-1 nova_compute[183403]: 2026-01-26 15:42:26.482 183407 INFO nova.scheduler.client.report [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Deleted allocations for instance 71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8
Jan 26 15:42:27 compute-1 nova_compute[183403]: 2026-01-26 15:42:27.096 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:42:27 compute-1 nova_compute[183403]: 2026-01-26 15:42:27.292 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:27 compute-1 nova_compute[183403]: 2026-01-26 15:42:27.516 183407 DEBUG oslo_concurrency.lockutils [None req-04f4a74b-8ee1-4a63-8497-de7dd5787f04 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "71f8ec67-10fd-4ff7-b1d2-3e168cb87ff8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.211 183407 DEBUG oslo_concurrency.lockutils [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "945ea863-2fb6-4499-8ade-804189f3375f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.212 183407 DEBUG oslo_concurrency.lockutils [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "945ea863-2fb6-4499-8ade-804189f3375f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.213 183407 DEBUG oslo_concurrency.lockutils [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "945ea863-2fb6-4499-8ade-804189f3375f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.213 183407 DEBUG oslo_concurrency.lockutils [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "945ea863-2fb6-4499-8ade-804189f3375f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.213 183407 DEBUG oslo_concurrency.lockutils [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "945ea863-2fb6-4499-8ade-804189f3375f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.230 183407 INFO nova.compute.manager [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Terminating instance
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.765 183407 DEBUG nova.compute.manager [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 15:42:28 compute-1 kernel: tapa2fc018e-f9 (unregistering): left promiscuous mode
Jan 26 15:42:28 compute-1 NetworkManager[55716]: <info>  [1769442148.8029] device (tapa2fc018e-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 15:42:28 compute-1 ovn_controller[95641]: 2026-01-26T15:42:28Z|00269|binding|INFO|Releasing lport a2fc018e-f92a-4576-b5f0-2d9d66e80c23 from this chassis (sb_readonly=0)
Jan 26 15:42:28 compute-1 ovn_controller[95641]: 2026-01-26T15:42:28Z|00270|binding|INFO|Setting lport a2fc018e-f92a-4576-b5f0-2d9d66e80c23 down in Southbound
Jan 26 15:42:28 compute-1 ovn_controller[95641]: 2026-01-26T15:42:28Z|00271|binding|INFO|Removing iface tapa2fc018e-f9 ovn-installed in OVS
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.810 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.812 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:28.817 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:22:00 10.100.0.7'], port_security=['fa:16:3e:fa:22:00 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '945ea863-2fb6-4499-8ade-804189f3375f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af8eea38f1d74ad1a01087c020ea8d02', 'neutron:revision_number': '14', 'neutron:security_group_ids': '41d84080-c1ba-42a6-8417-b57b30232ea3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2b7e03b-5278-4177-91b7-862e57a7c9ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>], logical_port=a2fc018e-f92a-4576-b5f0-2d9d66e80c23) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa7d0c921b0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:42:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:28.818 104930 INFO neutron.agent.ovn.metadata.agent [-] Port a2fc018e-f92a-4576-b5f0-2d9d66e80c23 in datapath d9d38847-a43b-4d1e-a0b1-c3f77a879374 unbound from our chassis
Jan 26 15:42:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:28.819 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9d38847-a43b-4d1e-a0b1-c3f77a879374, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:42:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:28.821 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[930b0b8b-ffd0-4e02-8625-bc0e17d3edf4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:28 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:28.821 104930 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 namespace which is not needed anymore
Jan 26 15:42:28 compute-1 nova_compute[183403]: 2026-01-26 15:42:28.827 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:28 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 26 15:42:28 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Consumed 3.966s CPU time.
Jan 26 15:42:28 compute-1 systemd-machined[154697]: Machine qemu-25-instance-00000020 terminated.
Jan 26 15:42:28 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215708]: [NOTICE]   (215731) : haproxy version is 3.0.5-8e879a5
Jan 26 15:42:28 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215708]: [NOTICE]   (215731) : path to executable is /usr/sbin/haproxy
Jan 26 15:42:28 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215708]: [WARNING]  (215731) : Exiting Master process...
Jan 26 15:42:28 compute-1 podman[216111]: 2026-01-26 15:42:28.935390027 +0000 UTC m=+0.030315180 container kill aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 15:42:28 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215708]: [ALERT]    (215731) : Current worker (215733) exited with code 143 (Terminated)
Jan 26 15:42:28 compute-1 neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374[215708]: [WARNING]  (215731) : All workers exited. Exiting... (0)
Jan 26 15:42:28 compute-1 systemd[1]: libpod-aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3.scope: Deactivated successfully.
Jan 26 15:42:28 compute-1 podman[216127]: 2026-01-26 15:42:28.975355242 +0000 UTC m=+0.022019230 container died aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 15:42:29 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3-userdata-shm.mount: Deactivated successfully.
Jan 26 15:42:29 compute-1 systemd[1]: var-lib-containers-storage-overlay-269752733cf3f9223258ac003f284996460fff72e6d92bc82b012d41e1185d5a-merged.mount: Deactivated successfully.
Jan 26 15:42:29 compute-1 podman[216127]: 2026-01-26 15:42:29.020436446 +0000 UTC m=+0.067100404 container cleanup aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:42:29 compute-1 systemd[1]: libpod-conmon-aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3.scope: Deactivated successfully.
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.038 183407 INFO nova.virt.libvirt.driver [-] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Instance destroyed successfully.
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.039 183407 DEBUG nova.objects.instance [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lazy-loading 'resources' on Instance uuid 945ea863-2fb6-4499-8ade-804189f3375f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.091 183407 DEBUG nova.virt.libvirt.vif [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-01-26T15:40:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1953533895',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1953533895',id=32,image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T15:41:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af8eea38f1d74ad1a01087c020ea8d02',ramdisk_id='',reservation_id='r-4n2bj1fm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='354e4d0e-4287-404f-93d3-2c85cfe92fbc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1233966703',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1233966703-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T15:42:03Z,user_data=None,user_id='136d3cfdd6cb48e2ab65221bcc05d26c',uuid=945ea863-2fb6-4499-8ade-804189f3375f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "address": "fa:16:3e:fa:22:00", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2fc018e-f9", "ovs_interfaceid": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.092 183407 DEBUG nova.network.os_vif_util [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converting VIF {"id": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "address": "fa:16:3e:fa:22:00", "network": {"id": "d9d38847-a43b-4d1e-a0b1-c3f77a879374", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1542262107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "312beda09adb420b9f44a490d7257008", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2fc018e-f9", "ovs_interfaceid": "a2fc018e-f92a-4576-b5f0-2d9d66e80c23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.092 183407 DEBUG nova.network.os_vif_util [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fa:22:00,bridge_name='br-int',has_traffic_filtering=True,id=a2fc018e-f92a-4576-b5f0-2d9d66e80c23,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2fc018e-f9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.092 183407 DEBUG os_vif [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:22:00,bridge_name='br-int',has_traffic_filtering=True,id=a2fc018e-f92a-4576-b5f0-2d9d66e80c23,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2fc018e-f9') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.094 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.095 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.095 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2fc018e-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:42:29 compute-1 podman[216129]: 2026-01-26 15:42:29.096156238 +0000 UTC m=+0.135429329 container remove aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.096 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.098 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.099 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.099 183407 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=642f78f8-54ac-4c17-9f46-b81b44edf90b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.099 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.101 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.101 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9405e9f5-9b59-4ea3-8f59-185f74acec0f]: (4, ("Mon Jan 26 03:42:28 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 (aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3)\naed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3\nMon Jan 26 03:42:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 (aed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3)\naed80525937a5c00fb8d97dd13d75156f80d547c63e3606b42161a44e1c933e3\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.102 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.103 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.103 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.103 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb078dc-ab8d-452d-8bb8-ca546511ac43]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.103 183407 INFO os_vif [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:22:00,bridge_name='br-int',has_traffic_filtering=True,id=a2fc018e-f92a-4576-b5f0-2d9d66e80c23,network=Network(d9d38847-a43b-4d1e-a0b1-c3f77a879374),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2fc018e-f9')
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.104 104930 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9d38847-a43b-4d1e-a0b1-c3f77a879374.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.104 183407 INFO nova.virt.libvirt.driver [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Deleting instance files /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f_del
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.104 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[2b7bca04-9285-4e85-9b7e-4807738906b3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.104 183407 INFO nova.virt.libvirt.driver [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Deletion of /var/lib/nova/instances/945ea863-2fb6-4499-8ade-804189f3375f_del complete
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.104 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9d38847-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:42:29 compute-1 kernel: tapd9d38847-a0: left promiscuous mode
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.107 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.119 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.122 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[f040038b-7819-4bcf-a3f4-b31b54fac755]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.140 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb50ccb-e8a1-48d4-afe3-243eeec679b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.142 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[e7851ad9-ffea-4d87-9e54-d9997cab3897]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.159 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[b90b7ac1-d5a5-4468-8825-9963b55e3e47]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580356, 'reachable_time': 23088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216177, 'error': None, 'target': 'ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.162 105448 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d9d38847-a43b-4d1e-a0b1-c3f77a879374 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 15:42:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:29.162 105448 DEBUG oslo.privsep.daemon [-] privsep: reply[583a6d19-579e-4381-8d6b-2f49b808acea]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:29 compute-1 systemd[1]: run-netns-ovnmeta\x2dd9d38847\x2da43b\x2d4d1e\x2da0b1\x2dc3f77a879374.mount: Deactivated successfully.
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.618 183407 INFO nova.compute.manager [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Took 0.85 seconds to destroy the instance on the hypervisor.
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.618 183407 DEBUG oslo.service.backend._eventlet.loopingcall [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.619 183407 DEBUG nova.compute.manager [-] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.619 183407 DEBUG nova.network.neutron [-] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.619 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.729 183407 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.763 183407 DEBUG nova.compute.manager [req-6be651f0-5f8c-400b-b385-89697cc01c4c req-e1c12723-f842-4e29-86ad-f5191e21bb42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Received event network-vif-unplugged-a2fc018e-f92a-4576-b5f0-2d9d66e80c23 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.764 183407 DEBUG oslo_concurrency.lockutils [req-6be651f0-5f8c-400b-b385-89697cc01c4c req-e1c12723-f842-4e29-86ad-f5191e21bb42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "945ea863-2fb6-4499-8ade-804189f3375f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.764 183407 DEBUG oslo_concurrency.lockutils [req-6be651f0-5f8c-400b-b385-89697cc01c4c req-e1c12723-f842-4e29-86ad-f5191e21bb42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "945ea863-2fb6-4499-8ade-804189f3375f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.764 183407 DEBUG oslo_concurrency.lockutils [req-6be651f0-5f8c-400b-b385-89697cc01c4c req-e1c12723-f842-4e29-86ad-f5191e21bb42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "945ea863-2fb6-4499-8ade-804189f3375f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.765 183407 DEBUG nova.compute.manager [req-6be651f0-5f8c-400b-b385-89697cc01c4c req-e1c12723-f842-4e29-86ad-f5191e21bb42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] No waiting events found dispatching network-vif-unplugged-a2fc018e-f92a-4576-b5f0-2d9d66e80c23 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:42:29 compute-1 nova_compute[183403]: 2026-01-26 15:42:29.765 183407 DEBUG nova.compute.manager [req-6be651f0-5f8c-400b-b385-89697cc01c4c req-e1c12723-f842-4e29-86ad-f5191e21bb42 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Received event network-vif-unplugged-a2fc018e-f92a-4576-b5f0-2d9d66e80c23 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 15:42:30 compute-1 nova_compute[183403]: 2026-01-26 15:42:30.329 183407 DEBUG nova.compute.manager [req-09f65f68-b13c-40e2-97fd-c9df0755aa56 req-6af3b66e-915d-4ce1-b701-18147eab32c3 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Received event network-vif-deleted-a2fc018e-f92a-4576-b5f0-2d9d66e80c23 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:42:30 compute-1 nova_compute[183403]: 2026-01-26 15:42:30.329 183407 INFO nova.compute.manager [req-09f65f68-b13c-40e2-97fd-c9df0755aa56 req-6af3b66e-915d-4ce1-b701-18147eab32c3 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Neutron deleted interface a2fc018e-f92a-4576-b5f0-2d9d66e80c23; detaching it from the instance and deleting it from the info cache
Jan 26 15:42:30 compute-1 nova_compute[183403]: 2026-01-26 15:42:30.329 183407 DEBUG nova.network.neutron [req-09f65f68-b13c-40e2-97fd-c9df0755aa56 req-6af3b66e-915d-4ce1-b701-18147eab32c3 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:42:30 compute-1 nova_compute[183403]: 2026-01-26 15:42:30.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:42:30 compute-1 nova_compute[183403]: 2026-01-26 15:42:30.746 183407 DEBUG nova.network.neutron [-] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 15:42:30 compute-1 nova_compute[183403]: 2026-01-26 15:42:30.840 183407 DEBUG nova.compute.manager [req-09f65f68-b13c-40e2-97fd-c9df0755aa56 req-6af3b66e-915d-4ce1-b701-18147eab32c3 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Detach interface failed, port_id=a2fc018e-f92a-4576-b5f0-2d9d66e80c23, reason: Instance 945ea863-2fb6-4499-8ade-804189f3375f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.244 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.245 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.246 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.246 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.306 183407 INFO nova.compute.manager [-] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Took 1.69 seconds to deallocate network for instance.
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.436 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.437 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.460 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.461 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5774MB free_disk=73.14462661743164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.461 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.462 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:31 compute-1 nova_compute[183403]: 2026-01-26 15:42:31.832 183407 DEBUG oslo_concurrency.lockutils [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.067 183407 DEBUG nova.compute.manager [req-6bffcf8a-7655-4fc6-ae5a-4acd1449c243 req-f6e8a4c6-c042-47e9-ab6c-ebcdfce2b332 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Received event network-vif-unplugged-a2fc018e-f92a-4576-b5f0-2d9d66e80c23 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.068 183407 DEBUG oslo_concurrency.lockutils [req-6bffcf8a-7655-4fc6-ae5a-4acd1449c243 req-f6e8a4c6-c042-47e9-ab6c-ebcdfce2b332 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Acquiring lock "945ea863-2fb6-4499-8ade-804189f3375f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.068 183407 DEBUG oslo_concurrency.lockutils [req-6bffcf8a-7655-4fc6-ae5a-4acd1449c243 req-f6e8a4c6-c042-47e9-ab6c-ebcdfce2b332 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "945ea863-2fb6-4499-8ade-804189f3375f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.069 183407 DEBUG oslo_concurrency.lockutils [req-6bffcf8a-7655-4fc6-ae5a-4acd1449c243 req-f6e8a4c6-c042-47e9-ab6c-ebcdfce2b332 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] Lock "945ea863-2fb6-4499-8ade-804189f3375f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.069 183407 DEBUG nova.compute.manager [req-6bffcf8a-7655-4fc6-ae5a-4acd1449c243 req-f6e8a4c6-c042-47e9-ab6c-ebcdfce2b332 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] No waiting events found dispatching network-vif-unplugged-a2fc018e-f92a-4576-b5f0-2d9d66e80c23 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.069 183407 WARNING nova.compute.manager [req-6bffcf8a-7655-4fc6-ae5a-4acd1449c243 req-f6e8a4c6-c042-47e9-ab6c-ebcdfce2b332 355968777ebc473bab93f0520b8ff65c efd5236a68844b42ac3b0c5bd372db1e - - default default] [instance: 945ea863-2fb6-4499-8ade-804189f3375f] Received unexpected event network-vif-unplugged-a2fc018e-f92a-4576-b5f0-2d9d66e80c23 for instance with vm_state deleted and task_state None.
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.294 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.559 183407 WARNING nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Instance 945ea863-2fb6-4499-8ade-804189f3375f is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.559 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.559 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:42:31 up  1:37,  0 user,  load average: 0.26, 0.22, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:42:32 compute-1 nova_compute[183403]: 2026-01-26 15:42:32.590 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:42:33 compute-1 nova_compute[183403]: 2026-01-26 15:42:33.176 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:42:33 compute-1 nova_compute[183403]: 2026-01-26 15:42:33.687 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:42:33 compute-1 nova_compute[183403]: 2026-01-26 15:42:33.688 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.226s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:33 compute-1 nova_compute[183403]: 2026-01-26 15:42:33.688 183407 DEBUG oslo_concurrency.lockutils [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.856s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:42:33 compute-1 nova_compute[183403]: 2026-01-26 15:42:33.717 183407 DEBUG oslo_concurrency.lockutils [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.028s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:33 compute-1 nova_compute[183403]: 2026-01-26 15:42:33.765 183407 INFO nova.scheduler.client.report [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Deleted allocations for instance 945ea863-2fb6-4499-8ade-804189f3375f
Jan 26 15:42:34 compute-1 nova_compute[183403]: 2026-01-26 15:42:34.102 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:34 compute-1 nova_compute[183403]: 2026-01-26 15:42:34.949 183407 DEBUG oslo_concurrency.lockutils [None req-53b9ded1-7ac5-4d98-a159-298d02482679 136d3cfdd6cb48e2ab65221bcc05d26c af8eea38f1d74ad1a01087c020ea8d02 - - default default] Lock "945ea863-2fb6-4499-8ade-804189f3375f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.736s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:42:35 compute-1 podman[192725]: time="2026-01-26T15:42:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:42:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:42:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:42:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:42:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Jan 26 15:42:37 compute-1 nova_compute[183403]: 2026-01-26 15:42:37.296 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:38 compute-1 nova_compute[183403]: 2026-01-26 15:42:38.690 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:42:38 compute-1 nova_compute[183403]: 2026-01-26 15:42:38.690 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:42:38 compute-1 nova_compute[183403]: 2026-01-26 15:42:38.690 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:42:38 compute-1 nova_compute[183403]: 2026-01-26 15:42:38.691 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:42:38 compute-1 nova_compute[183403]: 2026-01-26 15:42:38.691 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:42:38 compute-1 nova_compute[183403]: 2026-01-26 15:42:38.691 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:42:39 compute-1 nova_compute[183403]: 2026-01-26 15:42:39.106 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:39 compute-1 nova_compute[183403]: 2026-01-26 15:42:39.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:42:41 compute-1 nova_compute[183403]: 2026-01-26 15:42:41.936 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:42 compute-1 nova_compute[183403]: 2026-01-26 15:42:42.299 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:44 compute-1 nova_compute[183403]: 2026-01-26 15:42:44.109 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:46 compute-1 podman[216180]: 2026-01-26 15:42:46.890501291 +0000 UTC m=+0.066407526 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:42:46 compute-1 podman[216181]: 2026-01-26 15:42:46.89947679 +0000 UTC m=+0.075737775 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Jan 26 15:42:47 compute-1 nova_compute[183403]: 2026-01-26 15:42:47.301 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:49 compute-1 nova_compute[183403]: 2026-01-26 15:42:49.112 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:49 compute-1 openstack_network_exporter[195610]: ERROR   15:42:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:42:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:42:49 compute-1 openstack_network_exporter[195610]: ERROR   15:42:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:42:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:42:50 compute-1 nova_compute[183403]: 2026-01-26 15:42:50.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:42:52 compute-1 nova_compute[183403]: 2026-01-26 15:42:52.336 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:54 compute-1 nova_compute[183403]: 2026-01-26 15:42:54.115 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:56 compute-1 podman[216221]: 2026-01-26 15:42:56.892259997 +0000 UTC m=+0.073518327 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 15:42:56 compute-1 podman[216220]: 2026-01-26 15:42:56.901886041 +0000 UTC m=+0.085192603 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 26 15:42:56 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:56.983 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:74:6e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b3af1642-5f64-4a2a-b863-cb24c2f64020', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3af1642-5f64-4a2a-b863-cb24c2f64020', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dc45b4b07e644f39165e6ae4e4c98dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=987c7236-5ecf-4927-817a-c319edc8126f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=37342c34-08b0-4490-9464-8ba697def283) old=Port_Binding(mac=['fa:16:3e:fb:74:6e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b3af1642-5f64-4a2a-b863-cb24c2f64020', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3af1642-5f64-4a2a-b863-cb24c2f64020', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dc45b4b07e644f39165e6ae4e4c98dd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:42:56 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:56.984 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 37342c34-08b0-4490-9464-8ba697def283 in datapath b3af1642-5f64-4a2a-b863-cb24c2f64020 updated
Jan 26 15:42:56 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:56.984 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3af1642-5f64-4a2a-b863-cb24c2f64020, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:42:56 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:42:56.985 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[9838636a-c35f-4b69-8688-8048075f4d5a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:42:57 compute-1 nova_compute[183403]: 2026-01-26 15:42:57.339 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:42:59 compute-1 nova_compute[183403]: 2026-01-26 15:42:59.117 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:02 compute-1 nova_compute[183403]: 2026-01-26 15:43:02.341 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:04 compute-1 nova_compute[183403]: 2026-01-26 15:43:04.120 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:05 compute-1 podman[192725]: time="2026-01-26T15:43:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:43:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:43:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:43:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:43:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Jan 26 15:43:07 compute-1 nova_compute[183403]: 2026-01-26 15:43:07.346 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:43:08.045 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:64:bf 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-48f1fbe7-d24b-4eea-bca9-3c870110f123', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48f1fbe7-d24b-4eea-bca9-3c870110f123', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10dbe47a321a4103babc503f4f40bf9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9adff5cb-0bce-4ed9-b1ce-ff09a51a4303, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e0dd1fa1-e6fd-43b7-87e5-08b6e5abfd4d) old=Port_Binding(mac=['fa:16:3e:e8:64:bf'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-48f1fbe7-d24b-4eea-bca9-3c870110f123', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48f1fbe7-d24b-4eea-bca9-3c870110f123', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10dbe47a321a4103babc503f4f40bf9b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:43:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:43:08.046 104930 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e0dd1fa1-e6fd-43b7-87e5-08b6e5abfd4d in datapath 48f1fbe7-d24b-4eea-bca9-3c870110f123 updated
Jan 26 15:43:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:43:08.047 104930 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48f1fbe7-d24b-4eea-bca9-3c870110f123, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 15:43:08 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:43:08.048 203506 DEBUG oslo.privsep.daemon [-] privsep: reply[8f893cdc-bf47-46c2-9621-8ab9b2fe2283]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 15:43:09 compute-1 nova_compute[183403]: 2026-01-26 15:43:09.123 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:12 compute-1 nova_compute[183403]: 2026-01-26 15:43:12.347 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:14 compute-1 nova_compute[183403]: 2026-01-26 15:43:14.125 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:17 compute-1 nova_compute[183403]: 2026-01-26 15:43:17.395 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:17 compute-1 podman[216264]: 2026-01-26 15:43:17.900158392 +0000 UTC m=+0.072973753 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:43:17 compute-1 podman[216265]: 2026-01-26 15:43:17.91191614 +0000 UTC m=+0.084834175 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 26 15:43:18 compute-1 ovn_controller[95641]: 2026-01-26T15:43:18Z|00272|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 15:43:19 compute-1 nova_compute[183403]: 2026-01-26 15:43:19.128 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:19 compute-1 openstack_network_exporter[195610]: ERROR   15:43:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:43:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:43:19 compute-1 openstack_network_exporter[195610]: ERROR   15:43:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:43:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:43:22 compute-1 nova_compute[183403]: 2026-01-26 15:43:22.397 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:24 compute-1 nova_compute[183403]: 2026-01-26 15:43:24.171 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:27 compute-1 nova_compute[183403]: 2026-01-26 15:43:27.399 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:27 compute-1 nova_compute[183403]: 2026-01-26 15:43:27.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:43:27 compute-1 podman[216312]: 2026-01-26 15:43:27.887522094 +0000 UTC m=+0.061489442 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 26 15:43:27 compute-1 podman[216311]: 2026-01-26 15:43:27.915275248 +0000 UTC m=+0.096197713 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 26 15:43:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:43:29.104 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:43:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:43:29.105 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:43:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:43:29.105 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:43:29 compute-1 nova_compute[183403]: 2026-01-26 15:43:29.173 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:30 compute-1 nova_compute[183403]: 2026-01-26 15:43:30.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.093 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.221 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.222 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.247 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.247 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5840MB free_disk=73.14462661743164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.248 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:43:31 compute-1 nova_compute[183403]: 2026-01-26 15:43:31.248 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:43:32 compute-1 nova_compute[183403]: 2026-01-26 15:43:32.346 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:43:32 compute-1 nova_compute[183403]: 2026-01-26 15:43:32.347 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:43:31 up  1:38,  0 user,  load average: 0.09, 0.18, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:43:32 compute-1 nova_compute[183403]: 2026-01-26 15:43:32.403 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:32 compute-1 nova_compute[183403]: 2026-01-26 15:43:32.408 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:43:32 compute-1 nova_compute[183403]: 2026-01-26 15:43:32.452 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:43:32 compute-1 nova_compute[183403]: 2026-01-26 15:43:32.453 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:43:32 compute-1 nova_compute[183403]: 2026-01-26 15:43:32.465 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:43:32 compute-1 nova_compute[183403]: 2026-01-26 15:43:32.487 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:43:32 compute-1 nova_compute[183403]: 2026-01-26 15:43:32.513 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:43:33 compute-1 nova_compute[183403]: 2026-01-26 15:43:33.022 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:43:33 compute-1 nova_compute[183403]: 2026-01-26 15:43:33.533 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:43:33 compute-1 nova_compute[183403]: 2026-01-26 15:43:33.534 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.286s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:43:34 compute-1 nova_compute[183403]: 2026-01-26 15:43:34.213 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:35 compute-1 podman[192725]: time="2026-01-26T15:43:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:43:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:43:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:43:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:43:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:43:37 compute-1 nova_compute[183403]: 2026-01-26 15:43:37.431 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:39 compute-1 nova_compute[183403]: 2026-01-26 15:43:39.216 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:39 compute-1 nova_compute[183403]: 2026-01-26 15:43:39.534 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:43:39 compute-1 nova_compute[183403]: 2026-01-26 15:43:39.535 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:43:39 compute-1 nova_compute[183403]: 2026-01-26 15:43:39.535 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:43:39 compute-1 nova_compute[183403]: 2026-01-26 15:43:39.535 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:43:39 compute-1 nova_compute[183403]: 2026-01-26 15:43:39.536 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:43:40 compute-1 nova_compute[183403]: 2026-01-26 15:43:40.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:43:40 compute-1 nova_compute[183403]: 2026-01-26 15:43:40.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:43:42 compute-1 nova_compute[183403]: 2026-01-26 15:43:42.433 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:44 compute-1 nova_compute[183403]: 2026-01-26 15:43:44.218 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:47 compute-1 nova_compute[183403]: 2026-01-26 15:43:47.443 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:48 compute-1 podman[216359]: 2026-01-26 15:43:48.925554184 +0000 UTC m=+0.091743859 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
Jan 26 15:43:48 compute-1 podman[216358]: 2026-01-26 15:43:48.932400838 +0000 UTC m=+0.095604377 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:43:49 compute-1 nova_compute[183403]: 2026-01-26 15:43:49.222 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:49 compute-1 openstack_network_exporter[195610]: ERROR   15:43:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:43:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:43:49 compute-1 openstack_network_exporter[195610]: ERROR   15:43:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:43:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:43:51 compute-1 sshd-session[216402]: banner exchange: Connection from 3.132.23.201 port 55088: invalid format
Jan 26 15:43:52 compute-1 nova_compute[183403]: 2026-01-26 15:43:52.446 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:54 compute-1 nova_compute[183403]: 2026-01-26 15:43:54.225 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:54 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 15:43:57 compute-1 nova_compute[183403]: 2026-01-26 15:43:57.449 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:43:58 compute-1 podman[216405]: 2026-01-26 15:43:58.900027608 +0000 UTC m=+0.076588484 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent)
Jan 26 15:43:58 compute-1 podman[216404]: 2026-01-26 15:43:58.925374752 +0000 UTC m=+0.109745567 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260120)
Jan 26 15:43:59 compute-1 nova_compute[183403]: 2026-01-26 15:43:59.227 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:02 compute-1 nova_compute[183403]: 2026-01-26 15:44:02.451 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:04 compute-1 nova_compute[183403]: 2026-01-26 15:44:04.229 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:05 compute-1 podman[192725]: time="2026-01-26T15:44:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:44:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:44:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:44:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:44:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:44:07 compute-1 nova_compute[183403]: 2026-01-26 15:44:07.455 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:09 compute-1 nova_compute[183403]: 2026-01-26 15:44:09.231 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:12 compute-1 nova_compute[183403]: 2026-01-26 15:44:12.456 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:14 compute-1 nova_compute[183403]: 2026-01-26 15:44:14.234 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:17 compute-1 nova_compute[183403]: 2026-01-26 15:44:17.458 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:19 compute-1 nova_compute[183403]: 2026-01-26 15:44:19.237 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:19 compute-1 openstack_network_exporter[195610]: ERROR   15:44:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:44:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:44:19 compute-1 openstack_network_exporter[195610]: ERROR   15:44:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:44:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:44:19 compute-1 podman[216448]: 2026-01-26 15:44:19.88119934 +0000 UTC m=+0.058363952 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:44:19 compute-1 podman[216449]: 2026-01-26 15:44:19.937468796 +0000 UTC m=+0.099816267 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible)
Jan 26 15:44:22 compute-1 nova_compute[183403]: 2026-01-26 15:44:22.460 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:24 compute-1 nova_compute[183403]: 2026-01-26 15:44:24.240 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:27 compute-1 nova_compute[183403]: 2026-01-26 15:44:27.462 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:27 compute-1 sshd-session[216493]: banner exchange: Connection from 3.132.23.201 port 40476: invalid format
Jan 26 15:44:28 compute-1 nova_compute[183403]: 2026-01-26 15:44:28.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:44:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:44:29.106 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:44:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:44:29.106 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:44:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:44:29.107 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:44:29 compute-1 nova_compute[183403]: 2026-01-26 15:44:29.242 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:29 compute-1 podman[216496]: 2026-01-26 15:44:29.873280611 +0000 UTC m=+0.052147994 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 15:44:29 compute-1 podman[216495]: 2026-01-26 15:44:29.922651559 +0000 UTC m=+0.105346836 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Jan 26 15:44:32 compute-1 nova_compute[183403]: 2026-01-26 15:44:32.510 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:44:32 compute-1 nova_compute[183403]: 2026-01-26 15:44:32.513 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.022 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.023 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.023 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.023 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.244 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.246 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.267 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.268 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5858MB free_disk=73.14462661743164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.268 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:44:33 compute-1 nova_compute[183403]: 2026-01-26 15:44:33.268 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:44:34 compute-1 nova_compute[183403]: 2026-01-26 15:44:34.245 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:34 compute-1 nova_compute[183403]: 2026-01-26 15:44:34.321 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:44:34 compute-1 nova_compute[183403]: 2026-01-26 15:44:34.321 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:44:33 up  1:39,  0 user,  load average: 0.03, 0.14, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:44:34 compute-1 nova_compute[183403]: 2026-01-26 15:44:34.345 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:44:34 compute-1 nova_compute[183403]: 2026-01-26 15:44:34.854 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:44:35 compute-1 nova_compute[183403]: 2026-01-26 15:44:35.379 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:44:35 compute-1 nova_compute[183403]: 2026-01-26 15:44:35.380 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:44:35 compute-1 podman[192725]: time="2026-01-26T15:44:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:44:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:44:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:44:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:44:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Jan 26 15:44:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:44:36.735 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:44:36 compute-1 nova_compute[183403]: 2026-01-26 15:44:36.736 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:36 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:44:36.737 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:44:37 compute-1 nova_compute[183403]: 2026-01-26 15:44:37.516 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:39 compute-1 nova_compute[183403]: 2026-01-26 15:44:39.247 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:39 compute-1 nova_compute[183403]: 2026-01-26 15:44:39.445 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:44:39 compute-1 nova_compute[183403]: 2026-01-26 15:44:39.446 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:44:39 compute-1 nova_compute[183403]: 2026-01-26 15:44:39.446 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:44:39 compute-1 nova_compute[183403]: 2026-01-26 15:44:39.446 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:44:39 compute-1 nova_compute[183403]: 2026-01-26 15:44:39.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:44:40 compute-1 nova_compute[183403]: 2026-01-26 15:44:40.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:44:42 compute-1 nova_compute[183403]: 2026-01-26 15:44:42.518 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:42 compute-1 nova_compute[183403]: 2026-01-26 15:44:42.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:44:42 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:44:42.738 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:44:44 compute-1 nova_compute[183403]: 2026-01-26 15:44:44.249 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:47 compute-1 nova_compute[183403]: 2026-01-26 15:44:47.521 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:49 compute-1 nova_compute[183403]: 2026-01-26 15:44:49.294 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:49 compute-1 openstack_network_exporter[195610]: ERROR   15:44:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:44:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:44:49 compute-1 openstack_network_exporter[195610]: ERROR   15:44:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:44:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:44:50 compute-1 podman[216538]: 2026-01-26 15:44:50.89446976 +0000 UTC m=+0.060802038 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:44:50 compute-1 podman[216539]: 2026-01-26 15:44:50.921556965 +0000 UTC m=+0.092190370 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:44:52 compute-1 nova_compute[183403]: 2026-01-26 15:44:52.524 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:53 compute-1 nova_compute[183403]: 2026-01-26 15:44:53.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:44:54 compute-1 nova_compute[183403]: 2026-01-26 15:44:54.297 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:57 compute-1 nova_compute[183403]: 2026-01-26 15:44:57.527 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:44:59 compute-1 nova_compute[183403]: 2026-01-26 15:44:59.298 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:00 compute-1 podman[216584]: 2026-01-26 15:45:00.876184142 +0000 UTC m=+0.055349711 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Jan 26 15:45:00 compute-1 podman[216583]: 2026-01-26 15:45:00.941561564 +0000 UTC m=+0.124694720 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:45:02 compute-1 nova_compute[183403]: 2026-01-26 15:45:02.529 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:04 compute-1 nova_compute[183403]: 2026-01-26 15:45:04.301 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:05 compute-1 podman[192725]: time="2026-01-26T15:45:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:45:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:45:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:45:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:45:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Jan 26 15:45:07 compute-1 nova_compute[183403]: 2026-01-26 15:45:07.530 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:09 compute-1 nova_compute[183403]: 2026-01-26 15:45:09.305 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:12 compute-1 nova_compute[183403]: 2026-01-26 15:45:12.533 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:14 compute-1 nova_compute[183403]: 2026-01-26 15:45:14.307 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:17 compute-1 nova_compute[183403]: 2026-01-26 15:45:17.535 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:19 compute-1 nova_compute[183403]: 2026-01-26 15:45:19.310 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:19 compute-1 openstack_network_exporter[195610]: ERROR   15:45:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:45:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:45:19 compute-1 openstack_network_exporter[195610]: ERROR   15:45:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:45:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:45:21 compute-1 podman[216627]: 2026-01-26 15:45:21.905640667 +0000 UTC m=+0.068662852 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=openstack_network_exporter, name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public)
Jan 26 15:45:21 compute-1 podman[216626]: 2026-01-26 15:45:21.911305181 +0000 UTC m=+0.085356485 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:45:22 compute-1 nova_compute[183403]: 2026-01-26 15:45:22.540 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:24 compute-1 nova_compute[183403]: 2026-01-26 15:45:24.313 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:27 compute-1 nova_compute[183403]: 2026-01-26 15:45:27.544 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:28 compute-1 nova_compute[183403]: 2026-01-26 15:45:28.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:45:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:45:29.107 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:45:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:45:29.108 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:45:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:45:29.108 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:45:29 compute-1 nova_compute[183403]: 2026-01-26 15:45:29.351 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:31 compute-1 nova_compute[183403]: 2026-01-26 15:45:31.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:45:31 compute-1 podman[216673]: 2026-01-26 15:45:31.880254434 +0000 UTC m=+0.052252677 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 15:45:31 compute-1 podman[216672]: 2026-01-26 15:45:31.923332022 +0000 UTC m=+0.094836162 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.097 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.098 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.098 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.098 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.228 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.229 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.252 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.253 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5858MB free_disk=73.14432144165039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.253 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.253 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:45:32 compute-1 nova_compute[183403]: 2026-01-26 15:45:32.546 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:33 compute-1 nova_compute[183403]: 2026-01-26 15:45:33.300 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:45:33 compute-1 nova_compute[183403]: 2026-01-26 15:45:33.300 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:45:32 up  1:40,  0 user,  load average: 0.01, 0.11, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:45:33 compute-1 nova_compute[183403]: 2026-01-26 15:45:33.318 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:45:33 compute-1 nova_compute[183403]: 2026-01-26 15:45:33.826 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:45:34 compute-1 nova_compute[183403]: 2026-01-26 15:45:34.336 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:45:34 compute-1 nova_compute[183403]: 2026-01-26 15:45:34.337 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:45:34 compute-1 nova_compute[183403]: 2026-01-26 15:45:34.353 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:35 compute-1 podman[192725]: time="2026-01-26T15:45:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:45:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:45:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:45:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:45:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Jan 26 15:45:37 compute-1 nova_compute[183403]: 2026-01-26 15:45:37.547 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:39 compute-1 nova_compute[183403]: 2026-01-26 15:45:39.337 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:45:39 compute-1 nova_compute[183403]: 2026-01-26 15:45:39.338 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:45:39 compute-1 nova_compute[183403]: 2026-01-26 15:45:39.338 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:45:39 compute-1 nova_compute[183403]: 2026-01-26 15:45:39.357 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:40 compute-1 nova_compute[183403]: 2026-01-26 15:45:40.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:45:41 compute-1 nova_compute[183403]: 2026-01-26 15:45:41.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:45:42 compute-1 nova_compute[183403]: 2026-01-26 15:45:42.550 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:42 compute-1 nova_compute[183403]: 2026-01-26 15:45:42.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:45:43 compute-1 nova_compute[183403]: 2026-01-26 15:45:43.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:45:44 compute-1 nova_compute[183403]: 2026-01-26 15:45:44.359 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:47 compute-1 nova_compute[183403]: 2026-01-26 15:45:47.553 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:49 compute-1 nova_compute[183403]: 2026-01-26 15:45:49.362 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:49 compute-1 openstack_network_exporter[195610]: ERROR   15:45:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:45:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:45:49 compute-1 openstack_network_exporter[195610]: ERROR   15:45:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:45:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:45:52 compute-1 nova_compute[183403]: 2026-01-26 15:45:52.554 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:52 compute-1 podman[216720]: 2026-01-26 15:45:52.909635804 +0000 UTC m=+0.084153942 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:45:52 compute-1 podman[216721]: 2026-01-26 15:45:52.920596591 +0000 UTC m=+0.081675535 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 15:45:54 compute-1 nova_compute[183403]: 2026-01-26 15:45:54.365 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:55 compute-1 sshd-session[216763]: Accepted publickey for zuul from 192.168.122.10 port 40172 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 15:45:55 compute-1 systemd-logind[795]: New session 40 of user zuul.
Jan 26 15:45:55 compute-1 systemd[1]: Started Session 40 of User zuul.
Jan 26 15:45:55 compute-1 sshd-session[216763]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 15:45:55 compute-1 sudo[216767]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 26 15:45:55 compute-1 sudo[216767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 15:45:57 compute-1 nova_compute[183403]: 2026-01-26 15:45:57.556 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:45:59 compute-1 nova_compute[183403]: 2026-01-26 15:45:59.369 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:00 compute-1 ovs-vsctl[216938]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 15:46:01 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 216791 (sos)
Jan 26 15:46:01 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 26 15:46:01 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 26 15:46:01 compute-1 virtqemud[183290]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 15:46:01 compute-1 virtqemud[183290]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 15:46:01 compute-1 virtqemud[183290]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 15:46:02 compute-1 podman[217139]: 2026-01-26 15:46:02.354052741 +0000 UTC m=+0.116209841 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 15:46:02 compute-1 podman[217131]: 2026-01-26 15:46:02.365134171 +0000 UTC m=+0.126959503 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 15:46:02 compute-1 nova_compute[183403]: 2026-01-26 15:46:02.557 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:03 compute-1 crontab[217382]: (root) LIST (root)
Jan 26 15:46:04 compute-1 nova_compute[183403]: 2026-01-26 15:46:04.372 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:05 compute-1 systemd[1]: Starting Hostname Service...
Jan 26 15:46:05 compute-1 systemd[1]: Started Hostname Service.
Jan 26 15:46:05 compute-1 podman[192725]: time="2026-01-26T15:46:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:46:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:46:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:46:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:46:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Jan 26 15:46:07 compute-1 nova_compute[183403]: 2026-01-26 15:46:07.560 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:09 compute-1 nova_compute[183403]: 2026-01-26 15:46:09.381 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:12 compute-1 ovs-appctl[218530]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 15:46:12 compute-1 ovs-appctl[218533]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 15:46:12 compute-1 ovs-appctl[218538]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 15:46:12 compute-1 nova_compute[183403]: 2026-01-26 15:46:12.561 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:14 compute-1 nova_compute[183403]: 2026-01-26 15:46:14.384 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:17 compute-1 nova_compute[183403]: 2026-01-26 15:46:17.562 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:19 compute-1 nova_compute[183403]: 2026-01-26 15:46:19.387 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:19 compute-1 openstack_network_exporter[195610]: ERROR   15:46:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:46:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:46:19 compute-1 openstack_network_exporter[195610]: ERROR   15:46:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:46:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:46:21 compute-1 sshd-session[219545]: banner exchange: Connection from 3.132.23.201 port 60710: invalid format
Jan 26 15:46:22 compute-1 nova_compute[183403]: 2026-01-26 15:46:22.566 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:23 compute-1 podman[219805]: 2026-01-26 15:46:23.022857979 +0000 UTC m=+0.071609662 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:46:23 compute-1 podman[219808]: 2026-01-26 15:46:23.023588679 +0000 UTC m=+0.063641796 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, distribution-scope=public, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 15:46:23 compute-1 virtqemud[183290]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 15:46:24 compute-1 nova_compute[183403]: 2026-01-26 15:46:24.390 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:25 compute-1 systemd[1]: Starting Time & Date Service...
Jan 26 15:46:25 compute-1 systemd[1]: Started Time & Date Service.
Jan 26 15:46:27 compute-1 nova_compute[183403]: 2026-01-26 15:46:27.567 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:28 compute-1 nova_compute[183403]: 2026-01-26 15:46:28.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:46:29.109 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:46:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:46:29.109 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:46:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:46:29.109 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:46:29 compute-1 nova_compute[183403]: 2026-01-26 15:46:29.406 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:32 compute-1 nova_compute[183403]: 2026-01-26 15:46:32.606 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:32 compute-1 podman[220054]: 2026-01-26 15:46:32.889200802 +0000 UTC m=+0.059987490 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 15:46:32 compute-1 podman[220053]: 2026-01-26 15:46:32.959347065 +0000 UTC m=+0.122434034 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260120, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:46:33 compute-1 nova_compute[183403]: 2026-01-26 15:46:33.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.106 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.107 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.107 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.107 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.302 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.304 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.335 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.336 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5552MB free_disk=72.70772552490234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.337 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.337 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:46:34 compute-1 nova_compute[183403]: 2026-01-26 15:46:34.409 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:35 compute-1 nova_compute[183403]: 2026-01-26 15:46:35.398 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:46:35 compute-1 nova_compute[183403]: 2026-01-26 15:46:35.399 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:46:34 up  1:41,  0 user,  load average: 1.29, 0.42, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:46:35 compute-1 nova_compute[183403]: 2026-01-26 15:46:35.420 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:46:35 compute-1 podman[192725]: time="2026-01-26T15:46:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:46:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:46:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:46:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:46:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Jan 26 15:46:35 compute-1 nova_compute[183403]: 2026-01-26 15:46:35.931 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:46:36 compute-1 nova_compute[183403]: 2026-01-26 15:46:36.446 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:46:36 compute-1 nova_compute[183403]: 2026-01-26 15:46:36.447 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:46:37 compute-1 nova_compute[183403]: 2026-01-26 15:46:37.606 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:38 compute-1 nova_compute[183403]: 2026-01-26 15:46:38.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:38 compute-1 nova_compute[183403]: 2026-01-26 15:46:38.578 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:38 compute-1 nova_compute[183403]: 2026-01-26 15:46:38.579 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:46:39 compute-1 nova_compute[183403]: 2026-01-26 15:46:39.412 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:40 compute-1 nova_compute[183403]: 2026-01-26 15:46:40.086 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:40 compute-1 nova_compute[183403]: 2026-01-26 15:46:40.088 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:46:41 compute-1 nova_compute[183403]: 2026-01-26 15:46:41.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:41 compute-1 nova_compute[183403]: 2026-01-26 15:46:41.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:46:42 compute-1 nova_compute[183403]: 2026-01-26 15:46:42.088 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:46:42 compute-1 nova_compute[183403]: 2026-01-26 15:46:42.613 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:42 compute-1 sshd-session[220099]: banner exchange: Connection from 3.132.23.201 port 42536: invalid format
Jan 26 15:46:43 compute-1 nova_compute[183403]: 2026-01-26 15:46:43.089 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:43 compute-1 nova_compute[183403]: 2026-01-26 15:46:43.090 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:43 compute-1 nova_compute[183403]: 2026-01-26 15:46:43.090 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:43 compute-1 sudo[216767]: pam_unix(sudo:session): session closed for user root
Jan 26 15:46:43 compute-1 sshd-session[216766]: Received disconnect from 192.168.122.10 port 40172:11: disconnected by user
Jan 26 15:46:43 compute-1 sshd-session[216766]: Disconnected from user zuul 192.168.122.10 port 40172
Jan 26 15:46:43 compute-1 sshd-session[216763]: pam_unix(sshd:session): session closed for user zuul
Jan 26 15:46:43 compute-1 systemd-logind[795]: Session 40 logged out. Waiting for processes to exit.
Jan 26 15:46:43 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Jan 26 15:46:43 compute-1 systemd[1]: session-40.scope: Consumed 1min 21.206s CPU time, 545.0M memory peak, read 153.8M from disk, written 32.5M to disk.
Jan 26 15:46:43 compute-1 systemd-logind[795]: Removed session 40.
Jan 26 15:46:43 compute-1 sshd-session[220100]: Accepted publickey for zuul from 192.168.122.10 port 58004 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 15:46:44 compute-1 systemd-logind[795]: New session 41 of user zuul.
Jan 26 15:46:44 compute-1 systemd[1]: Started Session 41 of User zuul.
Jan 26 15:46:44 compute-1 sshd-session[220100]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 15:46:44 compute-1 sudo[220104]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2026-01-26-jwfuhhx.tar.xz
Jan 26 15:46:44 compute-1 sudo[220104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 15:46:44 compute-1 sudo[220104]: pam_unix(sudo:session): session closed for user root
Jan 26 15:46:44 compute-1 sshd-session[220103]: Received disconnect from 192.168.122.10 port 58004:11: disconnected by user
Jan 26 15:46:44 compute-1 sshd-session[220103]: Disconnected from user zuul 192.168.122.10 port 58004
Jan 26 15:46:44 compute-1 sshd-session[220100]: pam_unix(sshd:session): session closed for user zuul
Jan 26 15:46:44 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Jan 26 15:46:44 compute-1 systemd-logind[795]: Session 41 logged out. Waiting for processes to exit.
Jan 26 15:46:44 compute-1 systemd-logind[795]: Removed session 41.
Jan 26 15:46:44 compute-1 sshd-session[220129]: Accepted publickey for zuul from 192.168.122.10 port 58018 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 15:46:44 compute-1 systemd-logind[795]: New session 42 of user zuul.
Jan 26 15:46:44 compute-1 systemd[1]: Started Session 42 of User zuul.
Jan 26 15:46:44 compute-1 sshd-session[220129]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 15:46:44 compute-1 nova_compute[183403]: 2026-01-26 15:46:44.415 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:44 compute-1 sudo[220133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 26 15:46:44 compute-1 sudo[220133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 15:46:44 compute-1 sudo[220133]: pam_unix(sudo:session): session closed for user root
Jan 26 15:46:44 compute-1 sshd-session[220132]: Received disconnect from 192.168.122.10 port 58018:11: disconnected by user
Jan 26 15:46:44 compute-1 nova_compute[183403]: 2026-01-26 15:46:44.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:44 compute-1 sshd-session[220132]: Disconnected from user zuul 192.168.122.10 port 58018
Jan 26 15:46:44 compute-1 sshd-session[220129]: pam_unix(sshd:session): session closed for user zuul
Jan 26 15:46:44 compute-1 systemd-logind[795]: Session 42 logged out. Waiting for processes to exit.
Jan 26 15:46:44 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Jan 26 15:46:44 compute-1 systemd-logind[795]: Removed session 42.
Jan 26 15:46:47 compute-1 nova_compute[183403]: 2026-01-26 15:46:47.613 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:49 compute-1 openstack_network_exporter[195610]: ERROR   15:46:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:46:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:46:49 compute-1 openstack_network_exporter[195610]: ERROR   15:46:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:46:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:46:49 compute-1 nova_compute[183403]: 2026-01-26 15:46:49.454 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:52 compute-1 nova_compute[183403]: 2026-01-26 15:46:52.616 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:53 compute-1 podman[220159]: 2026-01-26 15:46:53.93617066 +0000 UTC m=+0.098680643 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:46:53 compute-1 podman[220160]: 2026-01-26 15:46:53.945231315 +0000 UTC m=+0.111034188 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Jan 26 15:46:54 compute-1 nova_compute[183403]: 2026-01-26 15:46:54.457 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:55 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 15:46:55 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 15:46:57 compute-1 nova_compute[183403]: 2026-01-26 15:46:57.617 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:46:58 compute-1 nova_compute[183403]: 2026-01-26 15:46:58.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:46:59 compute-1 nova_compute[183403]: 2026-01-26 15:46:59.459 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:02 compute-1 nova_compute[183403]: 2026-01-26 15:47:02.620 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:03 compute-1 podman[220210]: 2026-01-26 15:47:03.936847327 +0000 UTC m=+0.099638569 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 15:47:03 compute-1 podman[220209]: 2026-01-26 15:47:03.972295093 +0000 UTC m=+0.140251415 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 15:47:04 compute-1 nova_compute[183403]: 2026-01-26 15:47:04.462 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:04 compute-1 nova_compute[183403]: 2026-01-26 15:47:04.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:05 compute-1 podman[192725]: time="2026-01-26T15:47:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:47:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:47:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:47:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:47:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Jan 26 15:47:07 compute-1 nova_compute[183403]: 2026-01-26 15:47:07.622 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:09 compute-1 nova_compute[183403]: 2026-01-26 15:47:09.464 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:12 compute-1 nova_compute[183403]: 2026-01-26 15:47:12.624 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:14 compute-1 nova_compute[183403]: 2026-01-26 15:47:14.466 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:17 compute-1 nova_compute[183403]: 2026-01-26 15:47:17.627 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:19 compute-1 openstack_network_exporter[195610]: ERROR   15:47:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:47:19 compute-1 openstack_network_exporter[195610]: ERROR   15:47:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:47:19 compute-1 nova_compute[183403]: 2026-01-26 15:47:19.467 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:22 compute-1 nova_compute[183403]: 2026-01-26 15:47:22.628 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:24 compute-1 nova_compute[183403]: 2026-01-26 15:47:24.513 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:24 compute-1 podman[220254]: 2026-01-26 15:47:24.884147335 +0000 UTC m=+0.065787617 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:47:24 compute-1 podman[220255]: 2026-01-26 15:47:24.922802147 +0000 UTC m=+0.096604648 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Jan 26 15:47:25 compute-1 nova_compute[183403]: 2026-01-26 15:47:25.034 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:27 compute-1 nova_compute[183403]: 2026-01-26 15:47:27.630 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:29 compute-1 nova_compute[183403]: 2026-01-26 15:47:29.091 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:47:29.111 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:47:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:47:29.112 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:47:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:47:29.112 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:47:29 compute-1 nova_compute[183403]: 2026-01-26 15:47:29.515 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:32 compute-1 nova_compute[183403]: 2026-01-26 15:47:32.632 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:34 compute-1 nova_compute[183403]: 2026-01-26 15:47:34.518 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:34 compute-1 podman[220301]: 2026-01-26 15:47:34.916687631 +0000 UTC m=+0.084180932 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 26 15:47:34 compute-1 podman[220300]: 2026-01-26 15:47:34.918801959 +0000 UTC m=+0.089864237 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, tcib_build_tag=watcher_latest)
Jan 26 15:47:35 compute-1 nova_compute[183403]: 2026-01-26 15:47:35.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:35 compute-1 podman[192725]: time="2026-01-26T15:47:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:47:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:47:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:47:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:47:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.102 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.102 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.103 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.103 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.264 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.265 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.287 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.288 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5800MB free_disk=73.14408493041992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.288 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:47:36 compute-1 nova_compute[183403]: 2026-01-26 15:47:36.288 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:47:37 compute-1 nova_compute[183403]: 2026-01-26 15:47:37.398 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:47:37 compute-1 nova_compute[183403]: 2026-01-26 15:47:37.398 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:47:36 up  1:42,  0 user,  load average: 0.66, 0.41, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:47:37 compute-1 nova_compute[183403]: 2026-01-26 15:47:37.439 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:47:37 compute-1 nova_compute[183403]: 2026-01-26 15:47:37.637 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:37 compute-1 nova_compute[183403]: 2026-01-26 15:47:37.947 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:47:38 compute-1 nova_compute[183403]: 2026-01-26 15:47:38.478 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:47:38 compute-1 nova_compute[183403]: 2026-01-26 15:47:38.478 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.190s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:47:39 compute-1 nova_compute[183403]: 2026-01-26 15:47:39.571 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:40 compute-1 nova_compute[183403]: 2026-01-26 15:47:40.478 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:41 compute-1 nova_compute[183403]: 2026-01-26 15:47:41.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:41 compute-1 nova_compute[183403]: 2026-01-26 15:47:41.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:47:42 compute-1 nova_compute[183403]: 2026-01-26 15:47:42.636 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:43 compute-1 nova_compute[183403]: 2026-01-26 15:47:43.578 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:43 compute-1 nova_compute[183403]: 2026-01-26 15:47:43.579 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:44 compute-1 nova_compute[183403]: 2026-01-26 15:47:44.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:44 compute-1 nova_compute[183403]: 2026-01-26 15:47:44.573 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:44 compute-1 nova_compute[183403]: 2026-01-26 15:47:44.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:47:47 compute-1 nova_compute[183403]: 2026-01-26 15:47:47.638 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:49 compute-1 openstack_network_exporter[195610]: ERROR   15:47:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:47:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:47:49 compute-1 openstack_network_exporter[195610]: ERROR   15:47:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:47:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:47:49 compute-1 nova_compute[183403]: 2026-01-26 15:47:49.576 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:52 compute-1 nova_compute[183403]: 2026-01-26 15:47:52.652 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:54 compute-1 nova_compute[183403]: 2026-01-26 15:47:54.578 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:55 compute-1 podman[220344]: 2026-01-26 15:47:55.917850534 +0000 UTC m=+0.090150783 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:47:55 compute-1 podman[220345]: 2026-01-26 15:47:55.922289333 +0000 UTC m=+0.093197395 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, 
distribution-scope=public, config_id=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal)
Jan 26 15:47:57 compute-1 nova_compute[183403]: 2026-01-26 15:47:57.653 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:47:59 compute-1 nova_compute[183403]: 2026-01-26 15:47:59.582 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:02 compute-1 nova_compute[183403]: 2026-01-26 15:48:02.706 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:04 compute-1 nova_compute[183403]: 2026-01-26 15:48:04.584 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:05 compute-1 podman[192725]: time="2026-01-26T15:48:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:48:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:48:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:48:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:48:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Jan 26 15:48:05 compute-1 podman[220390]: 2026-01-26 15:48:05.89736435 +0000 UTC m=+0.062967950 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 26 15:48:05 compute-1 podman[220389]: 2026-01-26 15:48:05.959513096 +0000 UTC m=+0.135801505 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 15:48:07 compute-1 nova_compute[183403]: 2026-01-26 15:48:07.707 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:09 compute-1 nova_compute[183403]: 2026-01-26 15:48:09.587 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:12 compute-1 nova_compute[183403]: 2026-01-26 15:48:12.768 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:14 compute-1 nova_compute[183403]: 2026-01-26 15:48:14.589 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:17 compute-1 nova_compute[183403]: 2026-01-26 15:48:17.771 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:19 compute-1 openstack_network_exporter[195610]: ERROR   15:48:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:48:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:48:19 compute-1 openstack_network_exporter[195610]: ERROR   15:48:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:48:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:48:19 compute-1 nova_compute[183403]: 2026-01-26 15:48:19.591 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:22 compute-1 nova_compute[183403]: 2026-01-26 15:48:22.773 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:24 compute-1 nova_compute[183403]: 2026-01-26 15:48:24.594 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:26 compute-1 podman[220435]: 2026-01-26 15:48:26.890281478 +0000 UTC m=+0.061861111 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:48:26 compute-1 podman[220434]: 2026-01-26 15:48:26.915125808 +0000 UTC m=+0.093515604 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:48:27 compute-1 nova_compute[183403]: 2026-01-26 15:48:27.775 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:28 compute-1 nova_compute[183403]: 2026-01-26 15:48:28.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:48:29.113 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:48:29.114 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:48:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:48:29.114 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:48:29 compute-1 nova_compute[183403]: 2026-01-26 15:48:29.596 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:32 compute-1 nova_compute[183403]: 2026-01-26 15:48:32.814 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:34 compute-1 nova_compute[183403]: 2026-01-26 15:48:34.600 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:35 compute-1 podman[192725]: time="2026-01-26T15:48:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:48:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:48:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:48:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:48:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Jan 26 15:48:36 compute-1 podman[220479]: 2026-01-26 15:48:36.897582883 +0000 UTC m=+0.060458823 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 15:48:36 compute-1 podman[220478]: 2026-01-26 15:48:36.930669215 +0000 UTC m=+0.106492544 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:48:37 compute-1 nova_compute[183403]: 2026-01-26 15:48:37.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:48:37 compute-1 nova_compute[183403]: 2026-01-26 15:48:37.816 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.094 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.094 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.268 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.269 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.288 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.288 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5817MB free_disk=73.14408493041992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.289 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:48:38 compute-1 nova_compute[183403]: 2026-01-26 15:48:38.289 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:48:39 compute-1 nova_compute[183403]: 2026-01-26 15:48:39.445 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:48:39 compute-1 nova_compute[183403]: 2026-01-26 15:48:39.445 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:48:38 up  1:44,  0 user,  load average: 0.24, 0.34, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:48:39 compute-1 nova_compute[183403]: 2026-01-26 15:48:39.584 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:48:39 compute-1 nova_compute[183403]: 2026-01-26 15:48:39.604 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:39 compute-1 nova_compute[183403]: 2026-01-26 15:48:39.694 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:48:39 compute-1 nova_compute[183403]: 2026-01-26 15:48:39.695 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:48:39 compute-1 nova_compute[183403]: 2026-01-26 15:48:39.713 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:48:39 compute-1 nova_compute[183403]: 2026-01-26 15:48:39.737 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:48:39 compute-1 nova_compute[183403]: 2026-01-26 15:48:39.757 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:48:40 compute-1 nova_compute[183403]: 2026-01-26 15:48:40.267 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:48:40 compute-1 nova_compute[183403]: 2026-01-26 15:48:40.778 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:48:40 compute-1 nova_compute[183403]: 2026-01-26 15:48:40.778 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.489s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:48:42 compute-1 nova_compute[183403]: 2026-01-26 15:48:42.779 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:48:42 compute-1 nova_compute[183403]: 2026-01-26 15:48:42.817 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:43 compute-1 nova_compute[183403]: 2026-01-26 15:48:43.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:48:43 compute-1 nova_compute[183403]: 2026-01-26 15:48:43.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:48:43 compute-1 nova_compute[183403]: 2026-01-26 15:48:43.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:48:44 compute-1 nova_compute[183403]: 2026-01-26 15:48:44.607 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:45 compute-1 nova_compute[183403]: 2026-01-26 15:48:45.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:48:46 compute-1 nova_compute[183403]: 2026-01-26 15:48:46.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:48:46 compute-1 nova_compute[183403]: 2026-01-26 15:48:46.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:48:47 compute-1 nova_compute[183403]: 2026-01-26 15:48:47.820 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:49 compute-1 openstack_network_exporter[195610]: ERROR   15:48:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:48:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:48:49 compute-1 openstack_network_exporter[195610]: ERROR   15:48:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:48:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:48:49 compute-1 nova_compute[183403]: 2026-01-26 15:48:49.609 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:52 compute-1 nova_compute[183403]: 2026-01-26 15:48:52.822 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:54 compute-1 nova_compute[183403]: 2026-01-26 15:48:54.611 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:57 compute-1 nova_compute[183403]: 2026-01-26 15:48:57.822 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:48:57 compute-1 podman[220522]: 2026-01-26 15:48:57.889867107 +0000 UTC m=+0.059839125 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:48:57 compute-1 podman[220523]: 2026-01-26 15:48:57.921358177 +0000 UTC m=+0.088542610 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 26 15:48:59 compute-1 nova_compute[183403]: 2026-01-26 15:48:59.614 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:01 compute-1 sshd-session[220567]: banner exchange: Connection from 3.132.23.201 port 42402: invalid format
Jan 26 15:49:02 compute-1 nova_compute[183403]: 2026-01-26 15:49:02.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:49:02 compute-1 nova_compute[183403]: 2026-01-26 15:49:02.826 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:04 compute-1 nova_compute[183403]: 2026-01-26 15:49:04.616 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:05 compute-1 podman[192725]: time="2026-01-26T15:49:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:49:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:49:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:49:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:49:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Jan 26 15:49:07 compute-1 nova_compute[183403]: 2026-01-26 15:49:07.828 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:07 compute-1 podman[220569]: 2026-01-26 15:49:07.909882995 +0000 UTC m=+0.068763417 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:49:07 compute-1 podman[220568]: 2026-01-26 15:49:07.980471519 +0000 UTC m=+0.150091091 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Jan 26 15:49:09 compute-1 nova_compute[183403]: 2026-01-26 15:49:09.618 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:12 compute-1 nova_compute[183403]: 2026-01-26 15:49:12.830 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:14 compute-1 nova_compute[183403]: 2026-01-26 15:49:14.621 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:17 compute-1 nova_compute[183403]: 2026-01-26 15:49:17.865 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:19 compute-1 openstack_network_exporter[195610]: ERROR   15:49:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:49:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:49:19 compute-1 openstack_network_exporter[195610]: ERROR   15:49:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:49:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:49:19 compute-1 nova_compute[183403]: 2026-01-26 15:49:19.623 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:22 compute-1 nova_compute[183403]: 2026-01-26 15:49:22.866 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:24 compute-1 nova_compute[183403]: 2026-01-26 15:49:24.631 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:27 compute-1 nova_compute[183403]: 2026-01-26 15:49:27.868 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:27 compute-1 sshd-session[220613]: Connection closed by 3.132.23.201 port 46394 [preauth]
Jan 26 15:49:28 compute-1 podman[220615]: 2026-01-26 15:49:28.910550733 +0000 UTC m=+0.077414270 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:49:28 compute-1 podman[220616]: 2026-01-26 15:49:28.932007781 +0000 UTC m=+0.098361305 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vcs-type=git, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:49:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:49:29.115 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:49:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:49:29.115 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:49:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:49:29.115 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:49:29 compute-1 nova_compute[183403]: 2026-01-26 15:49:29.668 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:30 compute-1 nova_compute[183403]: 2026-01-26 15:49:30.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:49:32 compute-1 nova_compute[183403]: 2026-01-26 15:49:32.872 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:34 compute-1 nova_compute[183403]: 2026-01-26 15:49:34.671 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:35 compute-1 podman[192725]: time="2026-01-26T15:49:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:49:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:49:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:49:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:49:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Jan 26 15:49:37 compute-1 nova_compute[183403]: 2026-01-26 15:49:37.875 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:38 compute-1 nova_compute[183403]: 2026-01-26 15:49:38.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:49:38 compute-1 podman[220663]: 2026-01-26 15:49:38.893165602 +0000 UTC m=+0.068667705 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 15:49:38 compute-1 podman[220662]: 2026-01-26 15:49:38.989515891 +0000 UTC m=+0.167892292 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260120, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.104 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.104 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.104 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.104 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.253 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.254 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.270 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.270 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5825MB free_disk=73.14017868041992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.270 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.271 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:49:39 compute-1 nova_compute[183403]: 2026-01-26 15:49:39.673 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:40 compute-1 nova_compute[183403]: 2026-01-26 15:49:40.412 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:49:40 compute-1 nova_compute[183403]: 2026-01-26 15:49:40.413 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:49:39 up  1:45,  0 user,  load average: 0.08, 0.27, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:49:40 compute-1 nova_compute[183403]: 2026-01-26 15:49:40.432 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:49:40 compute-1 nova_compute[183403]: 2026-01-26 15:49:40.944 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:49:41 compute-1 nova_compute[183403]: 2026-01-26 15:49:41.460 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:49:41 compute-1 nova_compute[183403]: 2026-01-26 15:49:41.460 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.190s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:49:42 compute-1 nova_compute[183403]: 2026-01-26 15:49:42.460 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:49:42 compute-1 nova_compute[183403]: 2026-01-26 15:49:42.877 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:43 compute-1 nova_compute[183403]: 2026-01-26 15:49:43.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:49:43 compute-1 nova_compute[183403]: 2026-01-26 15:49:43.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:49:44 compute-1 nova_compute[183403]: 2026-01-26 15:49:44.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:49:44 compute-1 nova_compute[183403]: 2026-01-26 15:49:44.716 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:45 compute-1 nova_compute[183403]: 2026-01-26 15:49:45.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:49:47 compute-1 nova_compute[183403]: 2026-01-26 15:49:47.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:49:47 compute-1 nova_compute[183403]: 2026-01-26 15:49:47.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:49:47 compute-1 nova_compute[183403]: 2026-01-26 15:49:47.919 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:49 compute-1 openstack_network_exporter[195610]: ERROR   15:49:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:49:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:49:49 compute-1 openstack_network_exporter[195610]: ERROR   15:49:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:49:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:49:49 compute-1 nova_compute[183403]: 2026-01-26 15:49:49.717 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:52 compute-1 nova_compute[183403]: 2026-01-26 15:49:52.919 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:54 compute-1 nova_compute[183403]: 2026-01-26 15:49:54.721 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:57 compute-1 nova_compute[183403]: 2026-01-26 15:49:57.921 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:59 compute-1 nova_compute[183403]: 2026-01-26 15:49:59.768 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:49:59 compute-1 podman[220709]: 2026-01-26 15:49:59.891379064 +0000 UTC m=+0.071365847 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64)
Jan 26 15:49:59 compute-1 podman[220708]: 2026-01-26 15:49:59.918431304 +0000 UTC m=+0.091867880 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:50:02 compute-1 nova_compute[183403]: 2026-01-26 15:50:02.926 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:04 compute-1 nova_compute[183403]: 2026-01-26 15:50:04.772 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:05 compute-1 podman[192725]: time="2026-01-26T15:50:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:50:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:50:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:50:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:50:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2192 "" "Go-http-client/1.1"
Jan 26 15:50:07 compute-1 nova_compute[183403]: 2026-01-26 15:50:07.927 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:09 compute-1 nova_compute[183403]: 2026-01-26 15:50:09.774 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:09 compute-1 podman[220755]: 2026-01-26 15:50:09.905796741 +0000 UTC m=+0.085209290 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 15:50:09 compute-1 podman[220754]: 2026-01-26 15:50:09.936548581 +0000 UTC m=+0.118594011 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 15:50:12 compute-1 nova_compute[183403]: 2026-01-26 15:50:12.930 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:14 compute-1 nova_compute[183403]: 2026-01-26 15:50:14.777 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:17 compute-1 nova_compute[183403]: 2026-01-26 15:50:17.933 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:19 compute-1 openstack_network_exporter[195610]: ERROR   15:50:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:50:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:50:19 compute-1 openstack_network_exporter[195610]: ERROR   15:50:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:50:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:50:19 compute-1 nova_compute[183403]: 2026-01-26 15:50:19.777 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:22 compute-1 nova_compute[183403]: 2026-01-26 15:50:22.950 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:24 compute-1 nova_compute[183403]: 2026-01-26 15:50:24.781 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:27 compute-1 nova_compute[183403]: 2026-01-26 15:50:27.952 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:50:29.116 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:50:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:50:29.116 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:50:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:50:29.116 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:50:29 compute-1 nova_compute[183403]: 2026-01-26 15:50:29.820 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:30 compute-1 podman[220797]: 2026-01-26 15:50:30.894757431 +0000 UTC m=+0.070872934 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:50:30 compute-1 podman[220798]: 2026-01-26 15:50:30.926091626 +0000 UTC m=+0.090542364 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350)
Jan 26 15:50:31 compute-1 nova_compute[183403]: 2026-01-26 15:50:31.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:50:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:50:31.698 104930 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:ac:38', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'de:96:40:90:f3:e5'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 15:50:31 compute-1 nova_compute[183403]: 2026-01-26 15:50:31.699 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:31 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:50:31.700 104930 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 15:50:32 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:50:32.701 104930 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=41380b5a-e321-4ce4-bcc6-ecd563b3c793, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 15:50:32 compute-1 nova_compute[183403]: 2026-01-26 15:50:32.955 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:34 compute-1 nova_compute[183403]: 2026-01-26 15:50:34.823 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:35 compute-1 podman[192725]: time="2026-01-26T15:50:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:50:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:50:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:50:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:50:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2192 "" "Go-http-client/1.1"
Jan 26 15:50:37 compute-1 nova_compute[183403]: 2026-01-26 15:50:37.984 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:39 compute-1 nova_compute[183403]: 2026-01-26 15:50:39.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:50:39 compute-1 nova_compute[183403]: 2026-01-26 15:50:39.826 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.685 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.686 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.686 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.686 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.889 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.890 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.920 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.921 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5821MB free_disk=73.14059066772461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.921 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:50:40 compute-1 nova_compute[183403]: 2026-01-26 15:50:40.922 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:50:40 compute-1 podman[220842]: 2026-01-26 15:50:40.942771825 +0000 UTC m=+0.096180737 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 26 15:50:40 compute-1 podman[220841]: 2026-01-26 15:50:40.9859458 +0000 UTC m=+0.148371855 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 15:50:42 compute-1 nova_compute[183403]: 2026-01-26 15:50:42.024 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:50:42 compute-1 nova_compute[183403]: 2026-01-26 15:50:42.024 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:50:40 up  1:46,  0 user,  load average: 0.03, 0.22, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:50:42 compute-1 nova_compute[183403]: 2026-01-26 15:50:42.057 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:50:42 compute-1 nova_compute[183403]: 2026-01-26 15:50:42.566 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:50:42 compute-1 nova_compute[183403]: 2026-01-26 15:50:42.985 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:43 compute-1 nova_compute[183403]: 2026-01-26 15:50:43.147 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:50:43 compute-1 nova_compute[183403]: 2026-01-26 15:50:43.147 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.225s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:50:44 compute-1 nova_compute[183403]: 2026-01-26 15:50:44.828 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:45 compute-1 nova_compute[183403]: 2026-01-26 15:50:45.148 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:50:45 compute-1 nova_compute[183403]: 2026-01-26 15:50:45.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:50:45 compute-1 nova_compute[183403]: 2026-01-26 15:50:45.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:50:45 compute-1 nova_compute[183403]: 2026-01-26 15:50:45.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:50:46 compute-1 nova_compute[183403]: 2026-01-26 15:50:46.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:50:47 compute-1 nova_compute[183403]: 2026-01-26 15:50:47.987 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:48 compute-1 nova_compute[183403]: 2026-01-26 15:50:48.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:50:48 compute-1 nova_compute[183403]: 2026-01-26 15:50:48.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:50:49 compute-1 openstack_network_exporter[195610]: ERROR   15:50:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:50:49 compute-1 openstack_network_exporter[195610]: ERROR   15:50:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:50:49 compute-1 nova_compute[183403]: 2026-01-26 15:50:49.867 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:52 compute-1 nova_compute[183403]: 2026-01-26 15:50:52.990 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:54 compute-1 nova_compute[183403]: 2026-01-26 15:50:54.870 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:57 compute-1 nova_compute[183403]: 2026-01-26 15:50:57.992 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:50:59 compute-1 nova_compute[183403]: 2026-01-26 15:50:59.873 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:01 compute-1 podman[220884]: 2026-01-26 15:51:01.920695003 +0000 UTC m=+0.092953910 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:51:01 compute-1 podman[220885]: 2026-01-26 15:51:01.925742589 +0000 UTC m=+0.091978593 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 15:51:02 compute-1 nova_compute[183403]: 2026-01-26 15:51:02.992 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:04 compute-1 nova_compute[183403]: 2026-01-26 15:51:04.875 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:05 compute-1 podman[192725]: time="2026-01-26T15:51:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:51:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:51:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:51:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:51:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Jan 26 15:51:07 compute-1 nova_compute[183403]: 2026-01-26 15:51:07.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:07 compute-1 nova_compute[183403]: 2026-01-26 15:51:07.995 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:09 compute-1 nova_compute[183403]: 2026-01-26 15:51:09.878 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:11 compute-1 podman[220931]: 2026-01-26 15:51:11.950112351 +0000 UTC m=+0.107628616 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 15:51:11 compute-1 podman[220930]: 2026-01-26 15:51:11.967386967 +0000 UTC m=+0.132150928 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 15:51:13 compute-1 nova_compute[183403]: 2026-01-26 15:51:13.028 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:14 compute-1 nova_compute[183403]: 2026-01-26 15:51:14.881 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:18 compute-1 nova_compute[183403]: 2026-01-26 15:51:18.031 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:19 compute-1 openstack_network_exporter[195610]: ERROR   15:51:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:51:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:51:19 compute-1 openstack_network_exporter[195610]: ERROR   15:51:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:51:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:51:19 compute-1 nova_compute[183403]: 2026-01-26 15:51:19.884 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:23 compute-1 nova_compute[183403]: 2026-01-26 15:51:23.033 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:24 compute-1 nova_compute[183403]: 2026-01-26 15:51:24.887 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:28 compute-1 nova_compute[183403]: 2026-01-26 15:51:28.036 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:51:29.117 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:51:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:51:29.118 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:51:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:51:29.118 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:51:29 compute-1 nova_compute[183403]: 2026-01-26 15:51:29.889 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:31 compute-1 nova_compute[183403]: 2026-01-26 15:51:31.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:32 compute-1 podman[220979]: 2026-01-26 15:51:32.921787864 +0000 UTC m=+0.087629576 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:51:32 compute-1 podman[220980]: 2026-01-26 15:51:32.952831521 +0000 UTC m=+0.119006912 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 15:51:33 compute-1 nova_compute[183403]: 2026-01-26 15:51:33.038 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:34 compute-1 nova_compute[183403]: 2026-01-26 15:51:34.893 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:35 compute-1 podman[192725]: time="2026-01-26T15:51:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:51:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:51:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:51:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:51:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2190 "" "Go-http-client/1.1"
Jan 26 15:51:38 compute-1 nova_compute[183403]: 2026-01-26 15:51:38.093 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:38 compute-1 nova_compute[183403]: 2026-01-26 15:51:38.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:38 compute-1 nova_compute[183403]: 2026-01-26 15:51:38.578 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 15:51:39 compute-1 nova_compute[183403]: 2026-01-26 15:51:39.896 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.093 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.612 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.612 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.613 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.613 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.837 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.839 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.871 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.873 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5828MB free_disk=73.14013671875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.873 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:51:41 compute-1 nova_compute[183403]: 2026-01-26 15:51:41.874 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:51:42 compute-1 podman[221027]: 2026-01-26 15:51:42.879014093 +0000 UTC m=+0.051939212 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 26 15:51:42 compute-1 nova_compute[183403]: 2026-01-26 15:51:42.959 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:51:42 compute-1 nova_compute[183403]: 2026-01-26 15:51:42.960 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:51:41 up  1:47,  0 user,  load average: 0.01, 0.17, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:51:42 compute-1 podman[221026]: 2026-01-26 15:51:42.965550938 +0000 UTC m=+0.139741061 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 26 15:51:42 compute-1 nova_compute[183403]: 2026-01-26 15:51:42.992 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:51:43 compute-1 nova_compute[183403]: 2026-01-26 15:51:43.093 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:43 compute-1 nova_compute[183403]: 2026-01-26 15:51:43.504 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:51:44 compute-1 nova_compute[183403]: 2026-01-26 15:51:44.022 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:51:44 compute-1 nova_compute[183403]: 2026-01-26 15:51:44.022 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.148s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:51:44 compute-1 nova_compute[183403]: 2026-01-26 15:51:44.023 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:44 compute-1 nova_compute[183403]: 2026-01-26 15:51:44.023 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 15:51:44 compute-1 nova_compute[183403]: 2026-01-26 15:51:44.535 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 15:51:44 compute-1 nova_compute[183403]: 2026-01-26 15:51:44.900 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:47 compute-1 nova_compute[183403]: 2026-01-26 15:51:47.019 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:47 compute-1 nova_compute[183403]: 2026-01-26 15:51:47.020 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:47 compute-1 nova_compute[183403]: 2026-01-26 15:51:47.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:47 compute-1 nova_compute[183403]: 2026-01-26 15:51:47.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:47 compute-1 nova_compute[183403]: 2026-01-26 15:51:47.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:51:48 compute-1 nova_compute[183403]: 2026-01-26 15:51:48.141 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:49 compute-1 openstack_network_exporter[195610]: ERROR   15:51:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:51:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:51:49 compute-1 openstack_network_exporter[195610]: ERROR   15:51:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:51:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:51:49 compute-1 nova_compute[183403]: 2026-01-26 15:51:49.902 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:50 compute-1 nova_compute[183403]: 2026-01-26 15:51:50.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:50 compute-1 nova_compute[183403]: 2026-01-26 15:51:50.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:51:53 compute-1 nova_compute[183403]: 2026-01-26 15:51:53.141 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:54 compute-1 nova_compute[183403]: 2026-01-26 15:51:54.905 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:58 compute-1 nova_compute[183403]: 2026-01-26 15:51:58.207 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:51:59 compute-1 nova_compute[183403]: 2026-01-26 15:51:59.905 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:03 compute-1 nova_compute[183403]: 2026-01-26 15:52:03.209 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:03 compute-1 podman[221073]: 2026-01-26 15:52:03.873223484 +0000 UTC m=+0.052022205 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 15:52:03 compute-1 podman[221074]: 2026-01-26 15:52:03.91085319 +0000 UTC m=+0.086453255 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container)
Jan 26 15:52:04 compute-1 nova_compute[183403]: 2026-01-26 15:52:04.908 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:05 compute-1 podman[192725]: time="2026-01-26T15:52:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:52:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:52:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:52:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:52:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Jan 26 15:52:08 compute-1 nova_compute[183403]: 2026-01-26 15:52:08.212 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:09 compute-1 nova_compute[183403]: 2026-01-26 15:52:09.911 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:13 compute-1 nova_compute[183403]: 2026-01-26 15:52:13.212 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:13 compute-1 podman[221117]: 2026-01-26 15:52:13.882953771 +0000 UTC m=+0.051091330 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 15:52:13 compute-1 podman[221116]: 2026-01-26 15:52:13.920681609 +0000 UTC m=+0.091931292 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 26 15:52:14 compute-1 nova_compute[183403]: 2026-01-26 15:52:14.913 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:17 compute-1 nova_compute[183403]: 2026-01-26 15:52:17.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:52:18 compute-1 nova_compute[183403]: 2026-01-26 15:52:18.213 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:19 compute-1 openstack_network_exporter[195610]: ERROR   15:52:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:52:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:52:19 compute-1 openstack_network_exporter[195610]: ERROR   15:52:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:52:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:52:19 compute-1 nova_compute[183403]: 2026-01-26 15:52:19.914 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:23 compute-1 nova_compute[183403]: 2026-01-26 15:52:23.219 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:24 compute-1 nova_compute[183403]: 2026-01-26 15:52:24.920 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:28 compute-1 nova_compute[183403]: 2026-01-26 15:52:28.274 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:52:29.120 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:52:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:52:29.122 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:52:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:52:29.122 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:52:29 compute-1 nova_compute[183403]: 2026-01-26 15:52:29.923 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:33 compute-1 nova_compute[183403]: 2026-01-26 15:52:33.275 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:34 compute-1 nova_compute[183403]: 2026-01-26 15:52:34.093 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:52:34 compute-1 podman[221162]: 2026-01-26 15:52:34.894626083 +0000 UTC m=+0.069248390 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:52:34 compute-1 podman[221163]: 2026-01-26 15:52:34.903644926 +0000 UTC m=+0.078791087 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350)
Jan 26 15:52:34 compute-1 nova_compute[183403]: 2026-01-26 15:52:34.926 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:35 compute-1 podman[192725]: time="2026-01-26T15:52:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:52:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:52:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:52:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:52:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Jan 26 15:52:38 compute-1 nova_compute[183403]: 2026-01-26 15:52:38.278 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:39 compute-1 nova_compute[183403]: 2026-01-26 15:52:39.929 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:40 compute-1 nova_compute[183403]: 2026-01-26 15:52:40.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.092 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.093 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.093 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.239 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.240 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.263 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.264 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5825MB free_disk=73.14013671875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.264 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:52:41 compute-1 nova_compute[183403]: 2026-01-26 15:52:41.265 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:52:42 compute-1 nova_compute[183403]: 2026-01-26 15:52:42.313 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:52:42 compute-1 nova_compute[183403]: 2026-01-26 15:52:42.314 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:52:41 up  1:48,  0 user,  load average: 0.00, 0.14, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:52:42 compute-1 nova_compute[183403]: 2026-01-26 15:52:42.347 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:52:42 compute-1 nova_compute[183403]: 2026-01-26 15:52:42.861 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:52:43 compute-1 nova_compute[183403]: 2026-01-26 15:52:43.282 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:43 compute-1 nova_compute[183403]: 2026-01-26 15:52:43.443 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:52:43 compute-1 nova_compute[183403]: 2026-01-26 15:52:43.444 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.178s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:52:44 compute-1 podman[221209]: 2026-01-26 15:52:44.884422693 +0000 UTC m=+0.060188476 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, tcib_managed=true)
Jan 26 15:52:44 compute-1 podman[221208]: 2026-01-26 15:52:44.910125506 +0000 UTC m=+0.092126508 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 26 15:52:44 compute-1 nova_compute[183403]: 2026-01-26 15:52:44.930 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:47 compute-1 nova_compute[183403]: 2026-01-26 15:52:47.444 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:52:47 compute-1 nova_compute[183403]: 2026-01-26 15:52:47.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:52:47 compute-1 nova_compute[183403]: 2026-01-26 15:52:47.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:52:47 compute-1 nova_compute[183403]: 2026-01-26 15:52:47.577 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:52:48 compute-1 nova_compute[183403]: 2026-01-26 15:52:48.285 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:48 compute-1 nova_compute[183403]: 2026-01-26 15:52:48.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:52:49 compute-1 openstack_network_exporter[195610]: ERROR   15:52:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:52:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:52:49 compute-1 openstack_network_exporter[195610]: ERROR   15:52:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:52:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:52:49 compute-1 nova_compute[183403]: 2026-01-26 15:52:49.932 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:50 compute-1 nova_compute[183403]: 2026-01-26 15:52:50.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:52:50 compute-1 nova_compute[183403]: 2026-01-26 15:52:50.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:52:53 compute-1 nova_compute[183403]: 2026-01-26 15:52:53.287 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:54 compute-1 nova_compute[183403]: 2026-01-26 15:52:54.935 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:58 compute-1 nova_compute[183403]: 2026-01-26 15:52:58.290 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:52:59 compute-1 nova_compute[183403]: 2026-01-26 15:52:59.938 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:03 compute-1 nova_compute[183403]: 2026-01-26 15:53:03.293 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:04 compute-1 nova_compute[183403]: 2026-01-26 15:53:04.941 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:05 compute-1 podman[192725]: time="2026-01-26T15:53:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:53:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:53:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:53:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:53:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Jan 26 15:53:05 compute-1 podman[221251]: 2026-01-26 15:53:05.910129164 +0000 UTC m=+0.074028049 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7)
Jan 26 15:53:05 compute-1 podman[221250]: 2026-01-26 15:53:05.912138218 +0000 UTC m=+0.088239743 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 15:53:08 compute-1 nova_compute[183403]: 2026-01-26 15:53:08.295 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:09 compute-1 nova_compute[183403]: 2026-01-26 15:53:09.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:53:09 compute-1 nova_compute[183403]: 2026-01-26 15:53:09.944 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:13 compute-1 nova_compute[183403]: 2026-01-26 15:53:13.297 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:14 compute-1 nova_compute[183403]: 2026-01-26 15:53:14.977 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:15 compute-1 podman[221294]: 2026-01-26 15:53:15.929771318 +0000 UTC m=+0.088381066 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 15:53:15 compute-1 podman[221293]: 2026-01-26 15:53:15.94133959 +0000 UTC m=+0.103897444 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 15:53:18 compute-1 nova_compute[183403]: 2026-01-26 15:53:18.343 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:19 compute-1 openstack_network_exporter[195610]: ERROR   15:53:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:53:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:53:19 compute-1 openstack_network_exporter[195610]: ERROR   15:53:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:53:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:53:19 compute-1 nova_compute[183403]: 2026-01-26 15:53:19.980 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:23 compute-1 nova_compute[183403]: 2026-01-26 15:53:23.346 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:24 compute-1 nova_compute[183403]: 2026-01-26 15:53:24.983 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:28 compute-1 nova_compute[183403]: 2026-01-26 15:53:28.347 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:53:29.123 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:53:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:53:29.123 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:53:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:53:29.124 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:53:29 compute-1 nova_compute[183403]: 2026-01-26 15:53:29.985 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:33 compute-1 nova_compute[183403]: 2026-01-26 15:53:33.348 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:34 compute-1 nova_compute[183403]: 2026-01-26 15:53:34.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:53:34 compute-1 nova_compute[183403]: 2026-01-26 15:53:34.988 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:35 compute-1 podman[192725]: time="2026-01-26T15:53:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:53:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:53:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:53:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:53:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2199 "" "Go-http-client/1.1"
Jan 26 15:53:36 compute-1 podman[221334]: 2026-01-26 15:53:36.914930155 +0000 UTC m=+0.083727941 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 15:53:36 compute-1 podman[221335]: 2026-01-26 15:53:36.926656591 +0000 UTC m=+0.086496955 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Jan 26 15:53:38 compute-1 nova_compute[183403]: 2026-01-26 15:53:38.351 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:39 compute-1 nova_compute[183403]: 2026-01-26 15:53:39.991 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:41 compute-1 nova_compute[183403]: 2026-01-26 15:53:41.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.265 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.266 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.267 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.267 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.429 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.430 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.452 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.453 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5823MB free_disk=73.14013671875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.453 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:53:42 compute-1 nova_compute[183403]: 2026-01-26 15:53:42.454 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:53:43 compute-1 nova_compute[183403]: 2026-01-26 15:53:43.351 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:43 compute-1 nova_compute[183403]: 2026-01-26 15:53:43.582 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:53:43 compute-1 nova_compute[183403]: 2026-01-26 15:53:43.582 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:53:42 up  1:49,  0 user,  load average: 0.00, 0.11, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:53:43 compute-1 nova_compute[183403]: 2026-01-26 15:53:43.663 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing inventories for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 15:53:43 compute-1 nova_compute[183403]: 2026-01-26 15:53:43.723 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating ProviderTree inventory for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 15:53:43 compute-1 nova_compute[183403]: 2026-01-26 15:53:43.724 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Updating inventory in ProviderTree for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 15:53:43 compute-1 nova_compute[183403]: 2026-01-26 15:53:43.737 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing aggregate associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 15:53:43 compute-1 nova_compute[183403]: 2026-01-26 15:53:43.755 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Refreshing trait associations for resource provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ARCH_X86_64,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 15:53:43 compute-1 nova_compute[183403]: 2026-01-26 15:53:43.777 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:53:44 compute-1 nova_compute[183403]: 2026-01-26 15:53:44.289 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:53:44 compute-1 nova_compute[183403]: 2026-01-26 15:53:44.804 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:53:44 compute-1 nova_compute[183403]: 2026-01-26 15:53:44.805 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.351s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:53:44 compute-1 nova_compute[183403]: 2026-01-26 15:53:44.994 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:46 compute-1 podman[221381]: 2026-01-26 15:53:46.931101177 +0000 UTC m=+0.090363959 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 15:53:46 compute-1 podman[221380]: 2026-01-26 15:53:46.975224818 +0000 UTC m=+0.141798698 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 15:53:48 compute-1 nova_compute[183403]: 2026-01-26 15:53:48.354 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:49 compute-1 openstack_network_exporter[195610]: ERROR   15:53:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:53:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:53:49 compute-1 openstack_network_exporter[195610]: ERROR   15:53:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:53:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:53:49 compute-1 nova_compute[183403]: 2026-01-26 15:53:49.806 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:53:49 compute-1 nova_compute[183403]: 2026-01-26 15:53:49.807 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:53:49 compute-1 nova_compute[183403]: 2026-01-26 15:53:49.807 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:53:49 compute-1 nova_compute[183403]: 2026-01-26 15:53:49.808 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:53:49 compute-1 nova_compute[183403]: 2026-01-26 15:53:49.808 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:53:49 compute-1 nova_compute[183403]: 2026-01-26 15:53:49.995 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:50 compute-1 nova_compute[183403]: 2026-01-26 15:53:50.572 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:53:50 compute-1 nova_compute[183403]: 2026-01-26 15:53:50.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:53:53 compute-1 nova_compute[183403]: 2026-01-26 15:53:53.358 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:55 compute-1 nova_compute[183403]: 2026-01-26 15:53:55.000 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:53:58 compute-1 nova_compute[183403]: 2026-01-26 15:53:58.395 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:00 compute-1 nova_compute[183403]: 2026-01-26 15:54:00.002 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:03 compute-1 nova_compute[183403]: 2026-01-26 15:54:03.442 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:05 compute-1 nova_compute[183403]: 2026-01-26 15:54:05.005 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:05 compute-1 podman[192725]: time="2026-01-26T15:54:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:54:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:54:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:54:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:54:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Jan 26 15:54:07 compute-1 podman[221425]: 2026-01-26 15:54:07.909699624 +0000 UTC m=+0.083610077 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Jan 26 15:54:07 compute-1 podman[221424]: 2026-01-26 15:54:07.919296553 +0000 UTC m=+0.097676947 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:54:08 compute-1 nova_compute[183403]: 2026-01-26 15:54:08.444 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:10 compute-1 nova_compute[183403]: 2026-01-26 15:54:10.009 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:13 compute-1 nova_compute[183403]: 2026-01-26 15:54:13.448 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:15 compute-1 nova_compute[183403]: 2026-01-26 15:54:15.012 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:17 compute-1 podman[221469]: 2026-01-26 15:54:17.90456431 +0000 UTC m=+0.068640813 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest)
Jan 26 15:54:17 compute-1 podman[221468]: 2026-01-26 15:54:17.988005682 +0000 UTC m=+0.151648844 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 15:54:18 compute-1 nova_compute[183403]: 2026-01-26 15:54:18.449 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:19 compute-1 openstack_network_exporter[195610]: ERROR   15:54:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:54:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:54:19 compute-1 openstack_network_exporter[195610]: ERROR   15:54:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:54:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:54:20 compute-1 nova_compute[183403]: 2026-01-26 15:54:20.014 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:23 compute-1 nova_compute[183403]: 2026-01-26 15:54:23.451 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:25 compute-1 nova_compute[183403]: 2026-01-26 15:54:25.017 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:28 compute-1 nova_compute[183403]: 2026-01-26 15:54:28.454 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:54:29.124 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:54:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:54:29.125 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:54:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:54:29.125 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:54:30 compute-1 nova_compute[183403]: 2026-01-26 15:54:30.413 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:33 compute-1 nova_compute[183403]: 2026-01-26 15:54:33.454 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:35 compute-1 nova_compute[183403]: 2026-01-26 15:54:35.415 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:35 compute-1 podman[192725]: time="2026-01-26T15:54:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:54:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:54:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:54:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:54:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Jan 26 15:54:36 compute-1 nova_compute[183403]: 2026-01-26 15:54:36.577 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:54:38 compute-1 nova_compute[183403]: 2026-01-26 15:54:38.457 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:38 compute-1 podman[221514]: 2026-01-26 15:54:38.925011118 +0000 UTC m=+0.088209572 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:54:38 compute-1 podman[221515]: 2026-01-26 15:54:38.938269925 +0000 UTC m=+0.095061136 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Jan 26 15:54:40 compute-1 nova_compute[183403]: 2026-01-26 15:54:40.417 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:43 compute-1 nova_compute[183403]: 2026-01-26 15:54:43.458 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:43 compute-1 nova_compute[183403]: 2026-01-26 15:54:43.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.099 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.100 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.100 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.101 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.300 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.302 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.338 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.339 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5825MB free_disk=73.14013671875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.340 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:54:44 compute-1 nova_compute[183403]: 2026-01-26 15:54:44.340 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:54:45 compute-1 nova_compute[183403]: 2026-01-26 15:54:45.420 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:46 compute-1 nova_compute[183403]: 2026-01-26 15:54:46.548 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:54:46 compute-1 nova_compute[183403]: 2026-01-26 15:54:46.549 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:54:44 up  1:50,  0 user,  load average: 0.00, 0.09, 0.15\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:54:46 compute-1 nova_compute[183403]: 2026-01-26 15:54:46.924 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:54:47 compute-1 nova_compute[183403]: 2026-01-26 15:54:47.430 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:54:47 compute-1 nova_compute[183403]: 2026-01-26 15:54:47.939 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:54:47 compute-1 nova_compute[183403]: 2026-01-26 15:54:47.940 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.600s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:54:48 compute-1 nova_compute[183403]: 2026-01-26 15:54:48.460 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:48 compute-1 podman[221560]: 2026-01-26 15:54:48.917228071 +0000 UTC m=+0.079808105 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 15:54:48 compute-1 podman[221559]: 2026-01-26 15:54:48.923939432 +0000 UTC m=+0.105076836 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 26 15:54:49 compute-1 openstack_network_exporter[195610]: ERROR   15:54:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:54:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:54:49 compute-1 openstack_network_exporter[195610]: ERROR   15:54:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:54:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:54:50 compute-1 nova_compute[183403]: 2026-01-26 15:54:50.423 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:51 compute-1 nova_compute[183403]: 2026-01-26 15:54:51.941 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:54:51 compute-1 nova_compute[183403]: 2026-01-26 15:54:51.941 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:54:51 compute-1 nova_compute[183403]: 2026-01-26 15:54:51.942 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:54:51 compute-1 nova_compute[183403]: 2026-01-26 15:54:51.942 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:54:51 compute-1 nova_compute[183403]: 2026-01-26 15:54:51.943 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:54:52 compute-1 nova_compute[183403]: 2026-01-26 15:54:52.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:54:52 compute-1 nova_compute[183403]: 2026-01-26 15:54:52.575 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:54:53 compute-1 nova_compute[183403]: 2026-01-26 15:54:53.463 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:55 compute-1 nova_compute[183403]: 2026-01-26 15:54:55.425 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:54:58 compute-1 nova_compute[183403]: 2026-01-26 15:54:58.465 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:00 compute-1 nova_compute[183403]: 2026-01-26 15:55:00.428 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:03 compute-1 nova_compute[183403]: 2026-01-26 15:55:03.491 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:05 compute-1 nova_compute[183403]: 2026-01-26 15:55:05.432 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:05 compute-1 podman[192725]: time="2026-01-26T15:55:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:55:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:55:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:55:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:55:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Jan 26 15:55:08 compute-1 nova_compute[183403]: 2026-01-26 15:55:08.493 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:09 compute-1 podman[221603]: 2026-01-26 15:55:09.904567482 +0000 UTC m=+0.077781519 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 15:55:09 compute-1 podman[221604]: 2026-01-26 15:55:09.905036394 +0000 UTC m=+0.071899119 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal)
Jan 26 15:55:10 compute-1 nova_compute[183403]: 2026-01-26 15:55:10.436 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:13 compute-1 nova_compute[183403]: 2026-01-26 15:55:13.494 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:13 compute-1 nova_compute[183403]: 2026-01-26 15:55:13.571 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:15 compute-1 nova_compute[183403]: 2026-01-26 15:55:15.438 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:18 compute-1 nova_compute[183403]: 2026-01-26 15:55:18.495 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:19 compute-1 openstack_network_exporter[195610]: ERROR   15:55:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:55:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:55:19 compute-1 openstack_network_exporter[195610]: ERROR   15:55:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:55:19 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:55:19 compute-1 podman[221644]: 2026-01-26 15:55:19.921147405 +0000 UTC m=+0.086346740 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 15:55:19 compute-1 podman[221643]: 2026-01-26 15:55:19.957785533 +0000 UTC m=+0.131168659 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 26 15:55:20 compute-1 nova_compute[183403]: 2026-01-26 15:55:20.440 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:23 compute-1 nova_compute[183403]: 2026-01-26 15:55:23.499 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:25 compute-1 nova_compute[183403]: 2026-01-26 15:55:25.446 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:28 compute-1 nova_compute[183403]: 2026-01-26 15:55:28.499 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:55:29.126 104930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:55:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:55:29.127 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:55:29 compute-1 ovn_metadata_agent[104924]: 2026-01-26 15:55:29.127 104930 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:55:30 compute-1 nova_compute[183403]: 2026-01-26 15:55:30.449 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:33 compute-1 nova_compute[183403]: 2026-01-26 15:55:33.500 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:35 compute-1 nova_compute[183403]: 2026-01-26 15:55:35.452 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:35 compute-1 podman[192725]: time="2026-01-26T15:55:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:55:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:55:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:55:35 compute-1 podman[192725]: @ - - [26/Jan/2026:15:55:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Jan 26 15:55:36 compute-1 nova_compute[183403]: 2026-01-26 15:55:36.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:38 compute-1 nova_compute[183403]: 2026-01-26 15:55:38.503 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:40 compute-1 nova_compute[183403]: 2026-01-26 15:55:40.458 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:40 compute-1 podman[221690]: 2026-01-26 15:55:40.900574281 +0000 UTC m=+0.071735636 container health_status 46ea93d4e9ec98d54272c4272399094fa0d9d3c1ea39cac0ebaab906b698730f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 15:55:40 compute-1 podman[221691]: 2026-01-26 15:55:40.925831412 +0000 UTC m=+0.080101202 container health_status 90b4ecba7b8893933065520081ac926ab148c1acd4ec88cab9cbe81cce8ba1c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, 
com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Jan 26 15:55:43 compute-1 nova_compute[183403]: 2026-01-26 15:55:43.507 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:44 compute-1 nova_compute[183403]: 2026-01-26 15:55:44.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.135 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.135 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.136 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.136 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.307 183407 WARNING nova.virt.libvirt.driver [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.309 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.326 183407 DEBUG oslo_concurrency.processutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.327 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5824MB free_disk=73.14013671875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.328 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.329 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 15:55:45 compute-1 nova_compute[183403]: 2026-01-26 15:55:45.463 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:46 compute-1 nova_compute[183403]: 2026-01-26 15:55:46.454 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 15:55:46 compute-1 nova_compute[183403]: 2026-01-26 15:55:46.454 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 15:55:45 up  1:51,  0 user,  load average: 0.00, 0.07, 0.14\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 15:55:46 compute-1 nova_compute[183403]: 2026-01-26 15:55:46.856 183407 DEBUG nova.compute.provider_tree [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed in ProviderTree for provider: e3eb07a3-6ab4-4f51-ad76-347430ed2b67 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 15:55:47 compute-1 nova_compute[183403]: 2026-01-26 15:55:47.489 183407 DEBUG nova.scheduler.client.report [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Inventory has not changed for provider e3eb07a3-6ab4-4f51-ad76-347430ed2b67 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 15:55:48 compute-1 nova_compute[183403]: 2026-01-26 15:55:48.006 183407 DEBUG nova.compute.resource_tracker [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 15:55:48 compute-1 nova_compute[183403]: 2026-01-26 15:55:48.006 183407 DEBUG oslo_concurrency.lockutils [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.678s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 15:55:48 compute-1 nova_compute[183403]: 2026-01-26 15:55:48.509 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:49 compute-1 openstack_network_exporter[195610]: ERROR   15:55:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 15:55:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:55:49 compute-1 openstack_network_exporter[195610]: ERROR   15:55:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 15:55:49 compute-1 openstack_network_exporter[195610]: 
Jan 26 15:55:50 compute-1 nova_compute[183403]: 2026-01-26 15:55:50.465 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:50 compute-1 podman[221737]: 2026-01-26 15:55:50.895050619 +0000 UTC m=+0.064115510 container health_status bb10eddcee0c2131550f0ab9fe3f84b19e662d277234389f27af560254ca7e20 (image=38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 15:55:50 compute-1 podman[221736]: 2026-01-26 15:55:50.940808933 +0000 UTC m=+0.123128242 container health_status b55e96bb43bf0a56cd096d801b89b5b5fbd072e30dd86752dee6d75676a06ee1 (image=38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '208364e37a48eacc276c2aff85205e3d7a3fad90e90bc4880a4b3c6fbb0e4af0-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7-251b8cab7a833514c2b98fe237c61d3dc3c8d702d2ceb06b7e1566a4d08e24f7'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.230:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 26 15:55:51 compute-1 nova_compute[183403]: 2026-01-26 15:55:51.007 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:51 compute-1 nova_compute[183403]: 2026-01-26 15:55:51.008 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:51 compute-1 nova_compute[183403]: 2026-01-26 15:55:51.008 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:51 compute-1 nova_compute[183403]: 2026-01-26 15:55:51.009 183407 DEBUG nova.compute.manager [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 15:55:51 compute-1 nova_compute[183403]: 2026-01-26 15:55:51.578 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:51 compute-1 nova_compute[183403]: 2026-01-26 15:55:51.578 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:52 compute-1 nova_compute[183403]: 2026-01-26 15:55:52.573 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:53 compute-1 nova_compute[183403]: 2026-01-26 15:55:53.510 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:54 compute-1 nova_compute[183403]: 2026-01-26 15:55:54.576 183407 DEBUG oslo_service.periodic_task [None req-f36b708d-e0e6-497c-b178-4a1a4a6df6d6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 15:55:55 compute-1 nova_compute[183403]: 2026-01-26 15:55:55.507 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:55:57 compute-1 sshd-session[221781]: Accepted publickey for zuul from 192.168.122.10 port 50068 ssh2: ECDSA SHA256:jdvz2lyr5EyIYu+bvBKw53Euf9HIejup115smDZmKeU
Jan 26 15:55:57 compute-1 systemd-logind[795]: New session 43 of user zuul.
Jan 26 15:55:57 compute-1 systemd[1]: Started Session 43 of User zuul.
Jan 26 15:55:57 compute-1 sshd-session[221781]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 15:55:57 compute-1 sudo[221785]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 26 15:55:57 compute-1 sudo[221785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 15:55:58 compute-1 nova_compute[183403]: 2026-01-26 15:55:58.512 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:56:00 compute-1 nova_compute[183403]: 2026-01-26 15:56:00.508 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:56:02 compute-1 ovs-vsctl[221959]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 15:56:03 compute-1 nova_compute[183403]: 2026-01-26 15:56:03.514 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:56:03 compute-1 virtqemud[183290]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 15:56:03 compute-1 virtqemud[183290]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 15:56:03 compute-1 virtqemud[183290]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 15:56:04 compute-1 crontab[222362]: (root) LIST (root)
Jan 26 15:56:05 compute-1 nova_compute[183403]: 2026-01-26 15:56:05.512 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:56:05 compute-1 podman[192725]: time="2026-01-26T15:56:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 15:56:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:56:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 26 15:56:05 compute-1 podman[192725]: @ - - [26/Jan/2026:15:56:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2192 "" "Go-http-client/1.1"
Jan 26 15:56:06 compute-1 systemd[1]: Starting Hostname Service...
Jan 26 15:56:07 compute-1 systemd[1]: Started Hostname Service.
Jan 26 15:56:08 compute-1 nova_compute[183403]: 2026-01-26 15:56:08.516 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 15:56:10 compute-1 nova_compute[183403]: 2026-01-26 15:56:10.515 183407 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
